Test Report: Docker_Linux_containerd_arm64 21978

c78c82fa8bc5e05550c6fccb0bebb9cb966c725e:2025-11-24:42489

Failed tests (32/320)

Order  Failed test  Duration (s)
53 TestAddons/parallel/LocalPath 231.34
63 TestDockerEnvContainerd 51.16
99 TestFunctional/parallel/DashboardCmd 302.68
106 TestFunctional/parallel/ServiceCmdConnect 603.67
108 TestFunctional/parallel/PersistentVolumeClaim 249.67
146 TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup 240.83
152 TestFunctional/parallel/TunnelCmd/serial/AccessDirect 87.3
170 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/StartWithProxy 510.27
172 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/SoftStart 369.25
174 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/KubectlGetPods 2.22
184 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/MinikubeKubectlCmd 2.29
185 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/MinikubeKubectlCmdDirectly 2.43
186 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/ExtraConfig 737.68
187 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/ComponentHealth 2.24
190 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/InvalidService 0.06
193 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/DashboardCmd 1.73
196 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/StatusCmd 3.18
200 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmdConnect 2.44
202 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim 241.65
212 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/NodeLabels 3.24
236 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/RunSecondTunnel 0.47
239 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/WaitService/Setup 0.09
240 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/AccessDirect 84.17
245 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmd/DeployApp 0.06
246 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmd/List 0.28
247 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmd/JSONOutput 0.28
248 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmd/HTTPS 0.27
249 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmd/Format 0.26
250 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmd/URL 0.3
254 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/MountCmd/any-port 2.41
357 TestKubernetesUpgrade 801.64
447 TestStartStop/group/no-preload/serial/UserAppExistsAfterStop 7200.067
TestAddons/parallel/LocalPath (231.34s)

=== RUN   TestAddons/parallel/LocalPath
=== PAUSE TestAddons/parallel/LocalPath

=== CONT  TestAddons/parallel/LocalPath
addons_test.go:949: (dbg) Run:  kubectl --context addons-674149 apply -f testdata/storage-provisioner-rancher/pvc.yaml
addons_test.go:955: (dbg) Run:  kubectl --context addons-674149 apply -f testdata/storage-provisioner-rancher/pod.yaml
addons_test.go:959: (dbg) TestAddons/parallel/LocalPath: waiting 5m0s for pvc "test-pvc" in namespace "default" ...
helpers_test.go:402: (dbg) Run:  kubectl --context addons-674149 get pvc test-pvc -o jsonpath={.status.phase} -n default
helpers_test.go:402: (dbg) Run:  kubectl --context addons-674149 get pvc test-pvc -o jsonpath={.status.phase} -n default
helpers_test.go:402: (dbg) Run:  kubectl --context addons-674149 get pvc test-pvc -o jsonpath={.status.phase} -n default
helpers_test.go:402: (dbg) Run:  kubectl --context addons-674149 get pvc test-pvc -o jsonpath={.status.phase} -n default
helpers_test.go:402: (dbg) Run:  kubectl --context addons-674149 get pvc test-pvc -o jsonpath={.status.phase} -n default
helpers_test.go:402: (dbg) Run:  kubectl --context addons-674149 get pvc test-pvc -o jsonpath={.status.phase} -n default
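The repeated `kubectl get pvc` runs above are the test helper polling the claim's phase until it changes or the deadline expires. The same pattern as a minimal standalone sketch; `wait_for` is a hypothetical helper, not part of the minikube test suite:

```shell
# Poll a command until its stdout equals an expected value, or give up.
# Usage: wait_for <expected> <max-tries> <command...>
wait_for() {
  expected=$1
  budget=$2
  shift 2
  i=0
  while [ "$i" -lt "$budget" ]; do
    # Re-run the probe each iteration; a match ends the wait early.
    [ "$("$@" 2>/dev/null)" = "$expected" ] && return 0
    i=$((i + 1))
    sleep 1
  done
  return 1
}

# e.g. wait_for Bound 300 kubectl --context addons-674149 get pvc test-pvc \
#        -o 'jsonpath={.status.phase}' -n default
```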
addons_test.go:962: (dbg) TestAddons/parallel/LocalPath: waiting 3m0s for pods matching "run=test-local-path" in namespace "default" ...
helpers_test.go:352: "test-local-path" [db57d505-80a3-4fb4-b2fe-b9a4b8812a33] Pending / Ready:ContainersNotReady (containers with unready status: [busybox]) / ContainersReady:ContainersNotReady (containers with unready status: [busybox])
helpers_test.go:337: TestAddons/parallel/LocalPath: WARNING: pod list for "default" "run=test-local-path" returned: client rate limiter Wait returned an error: context deadline exceeded
addons_test.go:962: ***** TestAddons/parallel/LocalPath: pod "run=test-local-path" failed to start within 3m0s: context deadline exceeded ****
addons_test.go:962: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p addons-674149 -n addons-674149
addons_test.go:962: TestAddons/parallel/LocalPath: showing logs for failed pods as of 2025-11-24 08:51:02.637427127 +0000 UTC m=+478.004733689
addons_test.go:962: (dbg) Run:  kubectl --context addons-674149 describe po test-local-path -n default
addons_test.go:962: (dbg) kubectl --context addons-674149 describe po test-local-path -n default:
Name:             test-local-path
Namespace:        default
Priority:         0
Service Account:  default
Node:             addons-674149/192.168.49.2
Start Time:       Mon, 24 Nov 2025 08:48:02 +0000
Labels:           run=test-local-path
Annotations:      <none>
Status:           Pending
IP:               10.244.0.35
IPs:
IP:  10.244.0.35
Containers:
  busybox:
    Container ID:  
    Image:         busybox:stable
    Image ID:      
    Port:          <none>
    Host Port:     <none>
    Command:
      sh
      -c
      echo 'local-path-provisioner' > /test/file1
    State:          Waiting
      Reason:       ImagePullBackOff
    Ready:          False
    Restart Count:  0
    Environment:    <none>
    Mounts:
      /test from data (rw)
      /var/run/secrets/kubernetes.io/serviceaccount from kube-api-access-l7fl2 (ro)
Conditions:
  Type                        Status
  PodReadyToStartContainers   True 
  Initialized                 True 
  Ready                       False 
  ContainersReady             False 
  PodScheduled                True 
Volumes:
  data:
    Type:       PersistentVolumeClaim (a reference to a PersistentVolumeClaim in the same namespace)
    ClaimName:  test-pvc
    ReadOnly:   false
  kube-api-access-l7fl2:
    Type:                    Projected (a volume that contains injected data from multiple sources)
    TokenExpirationSeconds:  3607
    ConfigMapName:           kube-root-ca.crt
    Optional:                false
    DownwardAPI:             true
QoS Class:                   BestEffort
Node-Selectors:              <none>
Tolerations:                 node.kubernetes.io/not-ready:NoExecute op=Exists for 300s
                             node.kubernetes.io/unreachable:NoExecute op=Exists for 300s
Events:
Type     Reason     Age                  From               Message
----     ------     ----                 ----               -------
Normal   Scheduled  3m                   default-scheduler  Successfully assigned default/test-local-path to addons-674149
Warning  Failed     95s (x4 over 2m59s)  kubelet            Failed to pull image "busybox:stable": failed to pull and unpack image "docker.io/library/busybox:stable": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/library/busybox/manifests/sha256:355b3a1bf5609da364166913878a8508d4ba30572d02020a97028c75477e24ff: 429 Too Many Requests
toomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit
Normal   BackOff  17s (x10 over 2m59s)  kubelet  Back-off pulling image "busybox:stable"
Warning  Failed   17s (x10 over 2m59s)  kubelet  Error: ImagePullBackOff
Normal   Pulling  5s (x5 over 3m)       kubelet  Pulling image "busybox:stable"
Warning  Failed   5s (x5 over 2m59s)    kubelet  Error: ErrImagePull
Warning  Failed   5s                    kubelet  Failed to pull image "busybox:stable": failed to pull and unpack image "docker.io/library/busybox:stable": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/library/busybox/manifests/sha256:079b4a73854a059a2073c6e1a031b17fcbf23a47c6c59ae760d78045199e403c: 429 Too Many Requests
toomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit
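The events above show every pull of `busybox:stable` failing on Docker Hub's unauthenticated rate limit (HTTP 429), so the pod never leaves `ImagePullBackOff`. When triaging runs like this one, a saved log can be grepped for the distinct failure signatures; a small sketch (`localpath.log` is an assumed dump of the run output, not a file this job produces):

```shell
# Count image-pull failure signatures in a saved test log (sketch).
if [ -f localpath.log ]; then
  grep -oE 'ErrImagePull|ImagePullBackOff|429 Too Many Requests' localpath.log \
    | sort | uniq -c | sort -rn
fi
```

A dominant `429 Too Many Requests` count points at registry throttling rather than a broken image or manifest.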
addons_test.go:962: (dbg) Run:  kubectl --context addons-674149 logs test-local-path -n default
addons_test.go:962: (dbg) Non-zero exit: kubectl --context addons-674149 logs test-local-path -n default: exit status 1 (98.955829ms)

** stderr ** 
	Error from server (BadRequest): container "busybox" in pod "test-local-path" is waiting to start: trying and failing to pull image

** /stderr **
addons_test.go:962: kubectl --context addons-674149 logs test-local-path -n default: exit status 1
addons_test.go:963: failed waiting for test-local-path pod: run=test-local-path within 3m0s: context deadline exceeded
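The kubelet's `BackOff` events earlier in the log are its exponential image-pull backoff at work. The same retry-with-backoff shape as a standalone sketch for ad-hoc use; `retry` is a hypothetical helper, not something minikube ships:

```shell
# Retry a command up to N times, doubling the delay between attempts.
# Usage: retry <attempts> <command...>
retry() {
  attempts=$1
  shift
  delay=1
  n=1
  while :; do
    "$@" && return 0                      # success: stop retrying
    [ "$n" -ge "$attempts" ] && return 1  # budget exhausted
    sleep "$delay"
    delay=$((delay * 2))                  # exponential backoff
    n=$((n + 1))
  done
}

# e.g. retry 5 docker pull busybox:stable
```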
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:223: ======>  post-mortem[TestAddons/parallel/LocalPath]: network settings <======
helpers_test.go:230: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:238: ======>  post-mortem[TestAddons/parallel/LocalPath]: docker inspect <======
helpers_test.go:239: (dbg) Run:  docker inspect addons-674149
helpers_test.go:243: (dbg) docker inspect addons-674149:

-- stdout --
	[
	    {
	        "Id": "6f50b0c7a31b191decda1323dd6df71c5acf8a498ff595c68574daed6bd4cefc",
	        "Created": "2025-11-24T08:43:50.987696812Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 1655908,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-11-24T08:43:51.054729643Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:572c983e466f1f784136812eef5cc59ac623db764bc7704d3676c4643993fd08",
	        "ResolvConfPath": "/var/lib/docker/containers/6f50b0c7a31b191decda1323dd6df71c5acf8a498ff595c68574daed6bd4cefc/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/6f50b0c7a31b191decda1323dd6df71c5acf8a498ff595c68574daed6bd4cefc/hostname",
	        "HostsPath": "/var/lib/docker/containers/6f50b0c7a31b191decda1323dd6df71c5acf8a498ff595c68574daed6bd4cefc/hosts",
	        "LogPath": "/var/lib/docker/containers/6f50b0c7a31b191decda1323dd6df71c5acf8a498ff595c68574daed6bd4cefc/6f50b0c7a31b191decda1323dd6df71c5acf8a498ff595c68574daed6bd4cefc-json.log",
	        "Name": "/addons-674149",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "/lib/modules:/lib/modules:ro",
	                "addons-674149:/var"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "addons-674149",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "6f50b0c7a31b191decda1323dd6df71c5acf8a498ff595c68574daed6bd4cefc",
	                "LowerDir": "/var/lib/docker/overlay2/70cd6628664ed47f93a9a39be390103f5ce12b4880b31ad51cbe5445e0ac7f13-init/diff:/var/lib/docker/overlay2/bf294786c7ade59cb761c7bdcc27045362c3779b00a569b685443f34ade17737/diff",
	                "MergedDir": "/var/lib/docker/overlay2/70cd6628664ed47f93a9a39be390103f5ce12b4880b31ad51cbe5445e0ac7f13/merged",
	                "UpperDir": "/var/lib/docker/overlay2/70cd6628664ed47f93a9a39be390103f5ce12b4880b31ad51cbe5445e0ac7f13/diff",
	                "WorkDir": "/var/lib/docker/overlay2/70cd6628664ed47f93a9a39be390103f5ce12b4880b31ad51cbe5445e0ac7f13/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            },
	            {
	                "Type": "volume",
	                "Name": "addons-674149",
	                "Source": "/var/lib/docker/volumes/addons-674149/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            }
	        ],
	        "Config": {
	            "Hostname": "addons-674149",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8443/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "addons-674149",
	                "name.minikube.sigs.k8s.io": "addons-674149",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "177b2f9eb29788fba610c25ac03d8798d8e886b4fde029951b300b6a731bb71e",
	            "SandboxKey": "/var/run/docker/netns/177b2f9eb297",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34664"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34665"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34668"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34666"
	                    }
	                ],
	                "8443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34667"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "addons-674149": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "d6:13:8b:c0:f9:16",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "6469503e75d42721f29e6966939434398a736884d93d77ec539db8f12656f9b8",
	                    "EndpointID": "b063cc5970cf5254a5d1a359be9b8b4c1d26d2859d4dfad5ac487c55bb241c6f",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "addons-674149",
	                        "6f50b0c7a31b"
	                    ]
	                }
	            }
	        }
	    }
	]

-- /stdout --
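The host-port mappings sit deep in the inspect output above. Against a live daemon, `docker inspect --format '{{json .NetworkSettings.Ports}}' addons-674149` is the idiomatic way to pull them out; for a saved dump, a rough awk sketch (assumes the output was saved to `inspect.json`, a name chosen here for illustration):

```shell
# Print "container-port -> host-port" pairs from a saved `docker inspect` dump.
# Only entries after "NetworkSettings" are considered, so the empty
# PortBindings earlier in the JSON are skipped.
if [ -f inspect.json ]; then
  awk '
    /"NetworkSettings"/       { ns = 1 }
    ns && /"[0-9]+\/tcp": \[/ { p = $1; gsub(/[":]/, "", p) }
    ns && /"HostPort"/        { h = $2; gsub(/[",]/, "", h); print p " -> " h }
  ' inspect.json
fi
```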
helpers_test.go:247: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p addons-674149 -n addons-674149
helpers_test.go:252: <<< TestAddons/parallel/LocalPath FAILED: start of post-mortem logs <<<
helpers_test.go:253: ======>  post-mortem[TestAddons/parallel/LocalPath]: minikube logs <======
helpers_test.go:255: (dbg) Run:  out/minikube-linux-arm64 -p addons-674149 logs -n 25
helpers_test.go:255: (dbg) Done: out/minikube-linux-arm64 -p addons-674149 logs -n 25: (1.283373952s)
helpers_test.go:260: TestAddons/parallel/LocalPath logs: 
-- stdout --
	
	==> Audit <==
	┌─────────┬────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬────────────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│ COMMAND │                                                                                                                                                                                                                                      ARGS                                                                                                                                                                                                                                      │        PROFILE         │  USER   │ VERSION │     START TIME      │      END TIME       │
	├─────────┼────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼────────────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ delete  │ -p download-docker-750165                                                                                                                                                                                                                                                                                                                                                                                                                                                      │ download-docker-750165 │ jenkins │ v1.37.0 │ 24 Nov 25 08:43 UTC │ 24 Nov 25 08:43 UTC │
	│ start   │ --download-only -p binary-mirror-539266 --alsologtostderr --binary-mirror http://127.0.0.1:44655 --driver=docker  --container-runtime=containerd                                                                                                                                                                                                                                                                                                                               │ binary-mirror-539266   │ jenkins │ v1.37.0 │ 24 Nov 25 08:43 UTC │                     │
	│ delete  │ -p binary-mirror-539266                                                                                                                                                                                                                                                                                                                                                                                                                                                        │ binary-mirror-539266   │ jenkins │ v1.37.0 │ 24 Nov 25 08:43 UTC │ 24 Nov 25 08:43 UTC │
	│ addons  │ disable dashboard -p addons-674149                                                                                                                                                                                                                                                                                                                                                                                                                                             │ addons-674149          │ jenkins │ v1.37.0 │ 24 Nov 25 08:43 UTC │                     │
	│ addons  │ enable dashboard -p addons-674149                                                                                                                                                                                                                                                                                                                                                                                                                                              │ addons-674149          │ jenkins │ v1.37.0 │ 24 Nov 25 08:43 UTC │                     │
	│ start   │ -p addons-674149 --wait=true --memory=4096 --alsologtostderr --addons=registry --addons=registry-creds --addons=metrics-server --addons=volumesnapshots --addons=csi-hostpath-driver --addons=gcp-auth --addons=cloud-spanner --addons=inspektor-gadget --addons=nvidia-device-plugin --addons=yakd --addons=volcano --addons=amd-gpu-device-plugin --driver=docker  --container-runtime=containerd --addons=ingress --addons=ingress-dns --addons=storage-provisioner-rancher │ addons-674149          │ jenkins │ v1.37.0 │ 24 Nov 25 08:43 UTC │ 24 Nov 25 08:46 UTC │
	│ addons  │ addons-674149 addons disable volcano --alsologtostderr -v=1                                                                                                                                                                                                                                                                                                                                                                                                                    │ addons-674149          │ jenkins │ v1.37.0 │ 24 Nov 25 08:46 UTC │ 24 Nov 25 08:46 UTC │
	│ addons  │ addons-674149 addons disable gcp-auth --alsologtostderr -v=1                                                                                                                                                                                                                                                                                                                                                                                                                   │ addons-674149          │ jenkins │ v1.37.0 │ 24 Nov 25 08:46 UTC │ 24 Nov 25 08:47 UTC │
	│ addons  │ enable headlamp -p addons-674149 --alsologtostderr -v=1                                                                                                                                                                                                                                                                                                                                                                                                                        │ addons-674149          │ jenkins │ v1.37.0 │ 24 Nov 25 08:47 UTC │ 24 Nov 25 08:47 UTC │
	│ addons  │ addons-674149 addons disable headlamp --alsologtostderr -v=1                                                                                                                                                                                                                                                                                                                                                                                                                   │ addons-674149          │ jenkins │ v1.37.0 │ 24 Nov 25 08:47 UTC │ 24 Nov 25 08:47 UTC │
	│ ip      │ addons-674149 ip                                                                                                                                                                                                                                                                                                                                                                                                                                                               │ addons-674149          │ jenkins │ v1.37.0 │ 24 Nov 25 08:47 UTC │ 24 Nov 25 08:47 UTC │
	│ addons  │ addons-674149 addons disable registry --alsologtostderr -v=1                                                                                                                                                                                                                                                                                                                                                                                                                   │ addons-674149          │ jenkins │ v1.37.0 │ 24 Nov 25 08:47 UTC │ 24 Nov 25 08:47 UTC │
	│ addons  │ addons-674149 addons disable metrics-server --alsologtostderr -v=1                                                                                                                                                                                                                                                                                                                                                                                                             │ addons-674149          │ jenkins │ v1.37.0 │ 24 Nov 25 08:47 UTC │ 24 Nov 25 08:47 UTC │
	│ addons  │ addons-674149 addons disable inspektor-gadget --alsologtostderr -v=1                                                                                                                                                                                                                                                                                                                                                                                                           │ addons-674149          │ jenkins │ v1.37.0 │ 24 Nov 25 08:47 UTC │ 24 Nov 25 08:47 UTC │
	│ addons  │ addons-674149 addons disable volumesnapshots --alsologtostderr -v=1                                                                                                                                                                                                                                                                                                                                                                                                            │ addons-674149          │ jenkins │ v1.37.0 │ 24 Nov 25 08:47 UTC │ 24 Nov 25 08:47 UTC │
	│ ssh     │ addons-674149 ssh curl -s http://127.0.0.1/ -H 'Host: nginx.example.com'                                                                                                                                                                                                                                                                                                                                                                                                       │ addons-674149          │ jenkins │ v1.37.0 │ 24 Nov 25 08:47 UTC │ 24 Nov 25 08:47 UTC │
	│ ip      │ addons-674149 ip                                                                                                                                                                                                                                                                                                                                                                                                                                                               │ addons-674149          │ jenkins │ v1.37.0 │ 24 Nov 25 08:47 UTC │ 24 Nov 25 08:47 UTC │
	│ addons  │ addons-674149 addons disable csi-hostpath-driver --alsologtostderr -v=1                                                                                                                                                                                                                                                                                                                                                                                                        │ addons-674149          │ jenkins │ v1.37.0 │ 24 Nov 25 08:47 UTC │ 24 Nov 25 08:47 UTC │
	│ addons  │ addons-674149 addons disable ingress-dns --alsologtostderr -v=1                                                                                                                                                                                                                                                                                                                                                                                                                │ addons-674149          │ jenkins │ v1.37.0 │ 24 Nov 25 08:47 UTC │ 24 Nov 25 08:47 UTC │
	│ addons  │ addons-674149 addons disable ingress --alsologtostderr -v=1                                                                                                                                                                                                                                                                                                                                                                                                                    │ addons-674149          │ jenkins │ v1.37.0 │ 24 Nov 25 08:47 UTC │ 24 Nov 25 08:47 UTC │
	│ addons  │ configure registry-creds -f ./testdata/addons_testconfig.json -p addons-674149                                                                                                                                                                                                                                                                                                                                                                                                 │ addons-674149          │ jenkins │ v1.37.0 │ 24 Nov 25 08:47 UTC │ 24 Nov 25 08:47 UTC │
	│ addons  │ addons-674149 addons disable registry-creds --alsologtostderr -v=1                                                                                                                                                                                                                                                                                                                                                                                                             │ addons-674149          │ jenkins │ v1.37.0 │ 24 Nov 25 08:47 UTC │ 24 Nov 25 08:47 UTC │
	│ addons  │ addons-674149 addons disable yakd --alsologtostderr -v=1                                                                                                                                                                                                                                                                                                                                                                                                                       │ addons-674149          │ jenkins │ v1.37.0 │ 24 Nov 25 08:48 UTC │ 24 Nov 25 08:48 UTC │
	│ addons  │ addons-674149 addons disable nvidia-device-plugin --alsologtostderr -v=1                                                                                                                                                                                                                                                                                                                                                                                                       │ addons-674149          │ jenkins │ v1.37.0 │ 24 Nov 25 08:48 UTC │ 24 Nov 25 08:48 UTC │
	│ addons  │ addons-674149 addons disable cloud-spanner --alsologtostderr -v=1                                                                                                                                                                                                                                                                                                                                                                                                              │ addons-674149          │ jenkins │ v1.37.0 │ 24 Nov 25 08:48 UTC │ 24 Nov 25 08:48 UTC │
	└─────────┴──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴────────────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/11/24 08:43:26
	Running on machine: ip-172-31-30-239
	Binary: Built with gc go1.25.3 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1124 08:43:26.168772 1655502 out.go:360] Setting OutFile to fd 1 ...
	I1124 08:43:26.168897 1655502 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1124 08:43:26.168909 1655502 out.go:374] Setting ErrFile to fd 2...
	I1124 08:43:26.168915 1655502 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1124 08:43:26.169181 1655502 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21978-1652607/.minikube/bin
	I1124 08:43:26.169637 1655502 out.go:368] Setting JSON to false
	I1124 08:43:26.170486 1655502 start.go:133] hostinfo: {"hostname":"ip-172-31-30-239","uptime":26736,"bootTime":1763947071,"procs":145,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"92f46a7d-c249-4c12-924a-77f64874c910"}
	I1124 08:43:26.170560 1655502 start.go:143] virtualization:  
	I1124 08:43:26.173968 1655502 out.go:179] * [addons-674149] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1124 08:43:26.177895 1655502 out.go:179]   - MINIKUBE_LOCATION=21978
	I1124 08:43:26.177993 1655502 notify.go:221] Checking for updates...
	I1124 08:43:26.183716 1655502 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1124 08:43:26.186597 1655502 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/21978-1652607/kubeconfig
	I1124 08:43:26.189287 1655502 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/21978-1652607/.minikube
	I1124 08:43:26.192087 1655502 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1124 08:43:26.194991 1655502 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1124 08:43:26.198023 1655502 driver.go:422] Setting default libvirt URI to qemu:///system
	I1124 08:43:26.227945 1655502 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1124 08:43:26.228075 1655502 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1124 08:43:26.282970 1655502 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:0 ContainersRunning:0 ContainersPaused:0 ContainersStopped:0 Images:1 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:29 OomKillDisable:true NGoroutines:47 SystemTime:2025-11-24 08:43:26.274115039 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214831104 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1124 08:43:26.283071 1655502 docker.go:319] overlay module found
	I1124 08:43:26.286170 1655502 out.go:179] * Using the docker driver based on user configuration
	I1124 08:43:26.288892 1655502 start.go:309] selected driver: docker
	I1124 08:43:26.288908 1655502 start.go:927] validating driver "docker" against <nil>
	I1124 08:43:26.288921 1655502 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1124 08:43:26.289611 1655502 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1124 08:43:26.347499 1655502 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:0 ContainersRunning:0 ContainersPaused:0 ContainersStopped:0 Images:1 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:29 OomKillDisable:true NGoroutines:47 SystemTime:2025-11-24 08:43:26.338764384 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214831104 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1124 08:43:26.347667 1655502 start_flags.go:327] no existing cluster config was found, will generate one from the flags 
	I1124 08:43:26.347895 1655502 start_flags.go:992] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I1124 08:43:26.350743 1655502 out.go:179] * Using Docker driver with root privileges
	I1124 08:43:26.353520 1655502 cni.go:84] Creating CNI manager for ""
	I1124 08:43:26.353605 1655502 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1124 08:43:26.353619 1655502 start_flags.go:336] Found "CNI" CNI - setting NetworkPlugin=cni
	I1124 08:43:26.353696 1655502 start.go:353] cluster config:
	{Name:addons-674149 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.2 ClusterName:addons-674149 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.34.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1124 08:43:26.358675 1655502 out.go:179] * Starting "addons-674149" primary control-plane node in "addons-674149" cluster
	I1124 08:43:26.361577 1655502 cache.go:134] Beginning downloading kic base image for docker with containerd
	I1124 08:43:26.364512 1655502 out.go:179] * Pulling base image v0.0.48-1763789673-21948 ...
	I1124 08:43:26.367406 1655502 preload.go:188] Checking if preload exists for k8s version v1.34.2 and runtime containerd
	I1124 08:43:26.367459 1655502 preload.go:203] Found local preload: /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.34.2-containerd-overlay2-arm64.tar.lz4
	I1124 08:43:26.367477 1655502 cache.go:65] Caching tarball of preloaded images
	I1124 08:43:26.367487 1655502 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f in local docker daemon
	I1124 08:43:26.367578 1655502 preload.go:238] Found /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.34.2-containerd-overlay2-arm64.tar.lz4 in cache, skipping download
	I1124 08:43:26.367588 1655502 cache.go:68] Finished verifying existence of preloaded tar for v1.34.2 on containerd
	I1124 08:43:26.367927 1655502 profile.go:143] Saving config to /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/addons-674149/config.json ...
	I1124 08:43:26.367959 1655502 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/addons-674149/config.json: {Name:mk5494747f42686b676173483e93291d79fbedfa Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1124 08:43:26.382740 1655502 cache.go:163] Downloading gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f to local cache
	I1124 08:43:26.382879 1655502 image.go:65] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f in local cache directory
	I1124 08:43:26.382918 1655502 image.go:68] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f in local cache directory, skipping pull
	I1124 08:43:26.382932 1655502 image.go:137] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f exists in cache, skipping pull
	I1124 08:43:26.382943 1655502 cache.go:166] successfully saved gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f as a tarball
	I1124 08:43:26.382948 1655502 cache.go:176] Loading gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f from local cache
	I1124 08:43:44.357535 1655502 cache.go:178] successfully loaded and using gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f from cached tarball
	I1124 08:43:44.357575 1655502 cache.go:243] Successfully downloaded all kic artifacts
	I1124 08:43:44.357614 1655502 start.go:360] acquireMachinesLock for addons-674149: {Name:mk3ddec554c48a465f45c9b7924bd2a9c88a18df Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 08:43:44.357736 1655502 start.go:364] duration metric: took 96.338µs to acquireMachinesLock for "addons-674149"
	I1124 08:43:44.357768 1655502 start.go:93] Provisioning new machine with config: &{Name:addons-674149 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.2 ClusterName:addons-674149 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.34.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name: IP: Port:8443 KubernetesVersion:v1.34.2 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I1124 08:43:44.357850 1655502 start.go:125] createHost starting for "" (driver="docker")
	I1124 08:43:44.361235 1655502 out.go:252] * Creating docker container (CPUs=2, Memory=4096MB) ...
	I1124 08:43:44.361501 1655502 start.go:159] libmachine.API.Create for "addons-674149" (driver="docker")
	I1124 08:43:44.361556 1655502 client.go:173] LocalClient.Create starting
	I1124 08:43:44.361688 1655502 main.go:143] libmachine: Creating CA: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem
	I1124 08:43:44.861483 1655502 main.go:143] libmachine: Creating client certificate: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/cert.pem
	I1124 08:43:44.974089 1655502 cli_runner.go:164] Run: docker network inspect addons-674149 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	W1124 08:43:44.989572 1655502 cli_runner.go:211] docker network inspect addons-674149 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}" returned with exit code 1
	I1124 08:43:44.989651 1655502 network_create.go:284] running [docker network inspect addons-674149] to gather additional debugging logs...
	I1124 08:43:44.989673 1655502 cli_runner.go:164] Run: docker network inspect addons-674149
	W1124 08:43:45.037611 1655502 cli_runner.go:211] docker network inspect addons-674149 returned with exit code 1
	I1124 08:43:45.037642 1655502 network_create.go:287] error running [docker network inspect addons-674149]: docker network inspect addons-674149: exit status 1
	stdout:
	[]
	
	stderr:
	Error response from daemon: network addons-674149 not found
	I1124 08:43:45.037660 1655502 network_create.go:289] output of [docker network inspect addons-674149]: -- stdout --
	[]
	
	-- /stdout --
	** stderr ** 
	Error response from daemon: network addons-674149 not found
	
	** /stderr **
	I1124 08:43:45.037813 1655502 cli_runner.go:164] Run: docker network inspect bridge --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1124 08:43:45.078935 1655502 network.go:206] using free private subnet 192.168.49.0/24: &{IP:192.168.49.0 Netmask:255.255.255.0 Prefix:24 CIDR:192.168.49.0/24 Gateway:192.168.49.1 ClientMin:192.168.49.2 ClientMax:192.168.49.254 Broadcast:192.168.49.255 IsPrivate:true Interface:{IfaceName: IfaceIPv4: IfaceMTU:0 IfaceMAC:} reservation:0x40019ea7f0}
	I1124 08:43:45.078984 1655502 network_create.go:124] attempt to create docker network addons-674149 192.168.49.0/24 with gateway 192.168.49.1 and MTU of 1500 ...
	I1124 08:43:45.079048 1655502 cli_runner.go:164] Run: docker network create --driver=bridge --subnet=192.168.49.0/24 --gateway=192.168.49.1 -o --ip-masq -o --icc -o com.docker.network.driver.mtu=1500 --label=created_by.minikube.sigs.k8s.io=true --label=name.minikube.sigs.k8s.io=addons-674149 addons-674149
	I1124 08:43:45.187211 1655502 network_create.go:108] docker network addons-674149 192.168.49.0/24 created
	I1124 08:43:45.187260 1655502 kic.go:121] calculated static IP "192.168.49.2" for the "addons-674149" container
	I1124 08:43:45.187353 1655502 cli_runner.go:164] Run: docker ps -a --format {{.Names}}
	I1124 08:43:45.212683 1655502 cli_runner.go:164] Run: docker volume create addons-674149 --label name.minikube.sigs.k8s.io=addons-674149 --label created_by.minikube.sigs.k8s.io=true
	I1124 08:43:45.245044 1655502 oci.go:103] Successfully created a docker volume addons-674149
	I1124 08:43:45.245204 1655502 cli_runner.go:164] Run: docker run --rm --name addons-674149-preload-sidecar --label created_by.minikube.sigs.k8s.io=true --label name.minikube.sigs.k8s.io=addons-674149 --entrypoint /usr/bin/test -v addons-674149:/var gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f -d /var/lib
	I1124 08:43:46.992280 1655502 cli_runner.go:217] Completed: docker run --rm --name addons-674149-preload-sidecar --label created_by.minikube.sigs.k8s.io=true --label name.minikube.sigs.k8s.io=addons-674149 --entrypoint /usr/bin/test -v addons-674149:/var gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f -d /var/lib: (1.747030958s)
	I1124 08:43:46.992312 1655502 oci.go:107] Successfully prepared a docker volume addons-674149
	I1124 08:43:46.992366 1655502 preload.go:188] Checking if preload exists for k8s version v1.34.2 and runtime containerd
	I1124 08:43:46.992379 1655502 kic.go:194] Starting extracting preloaded images to volume ...
	I1124 08:43:46.992441 1655502 cli_runner.go:164] Run: docker run --rm --entrypoint /usr/bin/tar -v /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.34.2-containerd-overlay2-arm64.tar.lz4:/preloaded.tar:ro -v addons-674149:/extractDir gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f -I lz4 -xf /preloaded.tar -C /extractDir
	I1124 08:43:50.919339 1655502 cli_runner.go:217] Completed: docker run --rm --entrypoint /usr/bin/tar -v /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.34.2-containerd-overlay2-arm64.tar.lz4:/preloaded.tar:ro -v addons-674149:/extractDir gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f -I lz4 -xf /preloaded.tar -C /extractDir: (3.926852686s)
	I1124 08:43:50.919376 1655502 kic.go:203] duration metric: took 3.926991987s to extract preloaded images to volume ...
	W1124 08:43:50.919521 1655502 cgroups_linux.go:77] Your kernel does not support swap limit capabilities or the cgroup is not mounted.
	I1124 08:43:50.919635 1655502 cli_runner.go:164] Run: docker info --format "'{{json .SecurityOptions}}'"
	I1124 08:43:50.973357 1655502 cli_runner.go:164] Run: docker run -d -t --privileged --security-opt seccomp=unconfined --tmpfs /tmp --tmpfs /run -v /lib/modules:/lib/modules:ro --hostname addons-674149 --name addons-674149 --label created_by.minikube.sigs.k8s.io=true --label name.minikube.sigs.k8s.io=addons-674149 --label role.minikube.sigs.k8s.io= --label mode.minikube.sigs.k8s.io=addons-674149 --network addons-674149 --ip 192.168.49.2 --volume addons-674149:/var --security-opt apparmor=unconfined --memory=4096mb --cpus=2 -e container=docker --expose 8443 --publish=127.0.0.1::8443 --publish=127.0.0.1::22 --publish=127.0.0.1::2376 --publish=127.0.0.1::5000 --publish=127.0.0.1::32443 gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f
	I1124 08:43:51.258406 1655502 cli_runner.go:164] Run: docker container inspect addons-674149 --format={{.State.Running}}
	I1124 08:43:51.284025 1655502 cli_runner.go:164] Run: docker container inspect addons-674149 --format={{.State.Status}}
	I1124 08:43:51.304668 1655502 cli_runner.go:164] Run: docker exec addons-674149 stat /var/lib/dpkg/alternatives/iptables
	I1124 08:43:51.358166 1655502 oci.go:144] the created container "addons-674149" has a running status.
	I1124 08:43:51.358196 1655502 kic.go:225] Creating ssh key for kic: /home/jenkins/minikube-integration/21978-1652607/.minikube/machines/addons-674149/id_rsa...
	I1124 08:43:51.982237 1655502 kic_runner.go:191] docker (temp): /home/jenkins/minikube-integration/21978-1652607/.minikube/machines/addons-674149/id_rsa.pub --> /home/docker/.ssh/authorized_keys (381 bytes)
	I1124 08:43:52.007950 1655502 cli_runner.go:164] Run: docker container inspect addons-674149 --format={{.State.Status}}
	I1124 08:43:52.025519 1655502 kic_runner.go:93] Run: chown docker:docker /home/docker/.ssh/authorized_keys
	I1124 08:43:52.025545 1655502 kic_runner.go:114] Args: [docker exec --privileged addons-674149 chown docker:docker /home/docker/.ssh/authorized_keys]
	I1124 08:43:52.067757 1655502 cli_runner.go:164] Run: docker container inspect addons-674149 --format={{.State.Status}}
	I1124 08:43:52.084877 1655502 machine.go:94] provisionDockerMachine start ...
	I1124 08:43:52.084987 1655502 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-674149
	I1124 08:43:52.102189 1655502 main.go:143] libmachine: Using SSH client type: native
	I1124 08:43:52.102571 1655502 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 34664 <nil> <nil>}
	I1124 08:43:52.102586 1655502 main.go:143] libmachine: About to run SSH command:
	hostname
	I1124 08:43:52.103106 1655502 main.go:143] libmachine: Error dialing TCP: ssh: handshake failed: read tcp 127.0.0.1:47570->127.0.0.1:34664: read: connection reset by peer
	I1124 08:43:55.253938 1655502 main.go:143] libmachine: SSH cmd err, output: <nil>: addons-674149
	
	I1124 08:43:55.253975 1655502 ubuntu.go:182] provisioning hostname "addons-674149"
	I1124 08:43:55.254043 1655502 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-674149
	I1124 08:43:55.271246 1655502 main.go:143] libmachine: Using SSH client type: native
	I1124 08:43:55.271571 1655502 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 34664 <nil> <nil>}
	I1124 08:43:55.271589 1655502 main.go:143] libmachine: About to run SSH command:
	sudo hostname addons-674149 && echo "addons-674149" | sudo tee /etc/hostname
	I1124 08:43:55.427401 1655502 main.go:143] libmachine: SSH cmd err, output: <nil>: addons-674149
	
	I1124 08:43:55.427485 1655502 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-674149
	I1124 08:43:55.445572 1655502 main.go:143] libmachine: Using SSH client type: native
	I1124 08:43:55.445888 1655502 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 34664 <nil> <nil>}
	I1124 08:43:55.445909 1655502 main.go:143] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\saddons-674149' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 addons-674149/g' /etc/hosts;
				else 
					echo '127.0.1.1 addons-674149' | sudo tee -a /etc/hosts; 
				fi
			fi
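The shell fragment above (run over SSH by minikube) guarantees that the node's hostname resolves locally via a `127.0.1.1` entry. The same logic can be exercised against a scratch hosts file rather than the real `/etc/hosts` (the file path and the `old-name` placeholder below are illustrative, not from the log):

```shell
#!/bin/sh
# Exercise the hostname-entry logic from the log against a scratch hosts file.
hosts=$(mktemp)
printf '127.0.0.1 localhost\n127.0.1.1 old-name\n' > "$hosts"

name=addons-674149
# If the hostname is absent, either rewrite the existing 127.0.1.1 line
# or append a new one -- the same branch structure the provisioner uses.
if ! grep -q "\s$name" "$hosts"; then
    if grep -q '^127.0.1.1\s' "$hosts"; then
        sed -i "s/^127.0.1.1\s.*/127.0.1.1 $name/g" "$hosts"
    else
        echo "127.0.1.1 $name" >> "$hosts"
    fi
fi
cat "$hosts"
rm -f "$hosts"
```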
	I1124 08:43:55.595323 1655502 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	I1124 08:43:55.595391 1655502 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/21978-1652607/.minikube CaCertPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/21978-1652607/.minikube}
	I1124 08:43:55.595433 1655502 ubuntu.go:190] setting up certificates
	I1124 08:43:55.595461 1655502 provision.go:84] configureAuth start
	I1124 08:43:55.595551 1655502 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" addons-674149
	I1124 08:43:55.612761 1655502 provision.go:143] copyHostCerts
	I1124 08:43:55.612848 1655502 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.pem (1078 bytes)
	I1124 08:43:55.612962 1655502 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/21978-1652607/.minikube/cert.pem (1123 bytes)
	I1124 08:43:55.613019 1655502 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/21978-1652607/.minikube/key.pem (1679 bytes)
	I1124 08:43:55.613062 1655502 provision.go:117] generating server cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca-key.pem org=jenkins.addons-674149 san=[127.0.0.1 192.168.49.2 addons-674149 localhost minikube]
	I1124 08:43:56.302507 1655502 provision.go:177] copyRemoteCerts
	I1124 08:43:56.302600 1655502 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1124 08:43:56.302649 1655502 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-674149
	I1124 08:43:56.319817 1655502 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34664 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/addons-674149/id_rsa Username:docker}
	I1124 08:43:56.426547 1655502 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I1124 08:43:56.444294 1655502 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I1124 08:43:56.462218 1655502 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I1124 08:43:56.479746 1655502 provision.go:87] duration metric: took 884.237359ms to configureAuth
	I1124 08:43:56.479775 1655502 ubuntu.go:206] setting minikube options for container-runtime
	I1124 08:43:56.479973 1655502 config.go:182] Loaded profile config "addons-674149": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
	I1124 08:43:56.479986 1655502 machine.go:97] duration metric: took 4.395087905s to provisionDockerMachine
	I1124 08:43:56.479994 1655502 client.go:176] duration metric: took 12.118425719s to LocalClient.Create
	I1124 08:43:56.480023 1655502 start.go:167] duration metric: took 12.11852337s to libmachine.API.Create "addons-674149"
	I1124 08:43:56.480034 1655502 start.go:293] postStartSetup for "addons-674149" (driver="docker")
	I1124 08:43:56.480043 1655502 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1124 08:43:56.480115 1655502 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1124 08:43:56.480159 1655502 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-674149
	I1124 08:43:56.496798 1655502 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34664 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/addons-674149/id_rsa Username:docker}
	I1124 08:43:56.602935 1655502 ssh_runner.go:195] Run: cat /etc/os-release
	I1124 08:43:56.606302 1655502 main.go:143] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1124 08:43:56.606334 1655502 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1124 08:43:56.606346 1655502 filesync.go:126] Scanning /home/jenkins/minikube-integration/21978-1652607/.minikube/addons for local assets ...
	I1124 08:43:56.606429 1655502 filesync.go:126] Scanning /home/jenkins/minikube-integration/21978-1652607/.minikube/files for local assets ...
	I1124 08:43:56.606484 1655502 start.go:296] duration metric: took 126.438472ms for postStartSetup
	I1124 08:43:56.606837 1655502 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" addons-674149
	I1124 08:43:56.623427 1655502 profile.go:143] Saving config to /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/addons-674149/config.json ...
	I1124 08:43:56.623711 1655502 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1124 08:43:56.623759 1655502 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-674149
	I1124 08:43:56.640273 1655502 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34664 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/addons-674149/id_rsa Username:docker}
	I1124 08:43:56.743042 1655502 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1124 08:43:56.747301 1655502 start.go:128] duration metric: took 12.389436025s to createHost
	I1124 08:43:56.747325 1655502 start.go:83] releasing machines lock for "addons-674149", held for 12.389574423s
	I1124 08:43:56.747395 1655502 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" addons-674149
	I1124 08:43:56.763675 1655502 ssh_runner.go:195] Run: cat /version.json
	I1124 08:43:56.763725 1655502 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1124 08:43:56.763732 1655502 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-674149
	I1124 08:43:56.763797 1655502 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-674149
	I1124 08:43:56.783631 1655502 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34664 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/addons-674149/id_rsa Username:docker}
	I1124 08:43:56.787840 1655502 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34664 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/addons-674149/id_rsa Username:docker}
	I1124 08:43:56.970404 1655502 ssh_runner.go:195] Run: systemctl --version
	I1124 08:43:56.976668 1655502 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W1124 08:43:56.980782 1655502 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1124 08:43:56.980857 1655502 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1124 08:43:57.008373 1655502 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist, /etc/cni/net.d/10-crio-bridge.conflist.disabled] bridge cni config(s)
	I1124 08:43:57.008401 1655502 start.go:496] detecting cgroup driver to use...
	I1124 08:43:57.008438 1655502 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1124 08:43:57.008495 1655502 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I1124 08:43:57.023673 1655502 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I1124 08:43:57.036415 1655502 docker.go:218] disabling cri-docker service (if available) ...
	I1124 08:43:57.036485 1655502 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1124 08:43:57.054161 1655502 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1124 08:43:57.072156 1655502 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1124 08:43:57.186628 1655502 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1124 08:43:57.312998 1655502 docker.go:234] disabling docker service ...
	I1124 08:43:57.313075 1655502 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1124 08:43:57.334227 1655502 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1124 08:43:57.347318 1655502 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1124 08:43:57.471243 1655502 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1124 08:43:57.595938 1655502 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1124 08:43:57.608392 1655502 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I1124 08:43:57.622033 1655502 download.go:108] Downloading: https://dl.k8s.io/release/v1.34.2/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.34.2/bin/linux/arm64/kubeadm.sha256 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/linux/arm64/v1.34.2/kubeadm
	I1124 08:43:58.493327 1655502 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10.1"|' /etc/containerd/config.toml"
	I1124 08:43:58.503118 1655502 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I1124 08:43:58.512245 1655502 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I1124 08:43:58.512322 1655502 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I1124 08:43:58.521446 1655502 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1124 08:43:58.530306 1655502 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I1124 08:43:58.539179 1655502 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1124 08:43:58.548173 1655502 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1124 08:43:58.556560 1655502 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I1124 08:43:58.565205 1655502 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I1124 08:43:58.573807 1655502 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I1124 08:43:58.582438 1655502 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1124 08:43:58.590313 1655502 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1124 08:43:58.598281 1655502 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1124 08:43:58.709744 1655502 ssh_runner.go:195] Run: sudo systemctl restart containerd
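The run of `sed` commands above rewrites `/etc/containerd/config.toml` so containerd uses the `cgroupfs` driver (matching the `cgroupfs` driver detected on the host) before restarting the service. A minimal sketch of the key substitution, applied to a throwaway sample config instead of the real file (the sample TOML contents are illustrative):

```shell
#!/bin/sh
# Sketch of the cgroup-driver rewrite from the log, run against a temp copy
# of a containerd CRI config rather than /etc/containerd/config.toml.
cfg=$(mktemp)
cat > "$cfg" <<'EOF'
[plugins."io.containerd.grpc.v1.cri".containerd.runtimes.runc.options]
  SystemdCgroup = true
EOF

# Same substitution the provisioner runs: flip SystemdCgroup to false,
# preserving the line's indentation via the captured group.
sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' "$cfg"

grep 'SystemdCgroup' "$cfg"
rm -f "$cfg"
```

On a real node this must be followed by `systemctl daemon-reload` and `systemctl restart containerd`, as the log shows.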
	I1124 08:43:58.828600 1655502 start.go:543] Will wait 60s for socket path /run/containerd/containerd.sock
	I1124 08:43:58.828682 1655502 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I1124 08:43:58.832303 1655502 start.go:564] Will wait 60s for crictl version
	I1124 08:43:58.832382 1655502 ssh_runner.go:195] Run: which crictl
	I1124 08:43:58.835885 1655502 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1124 08:43:58.860027 1655502 start.go:580] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v2.1.5
	RuntimeApiVersion:  v1
	I1124 08:43:58.860109 1655502 ssh_runner.go:195] Run: containerd --version
	I1124 08:43:58.880211 1655502 ssh_runner.go:195] Run: containerd --version
	I1124 08:43:58.907185 1655502 out.go:179] * Preparing Kubernetes v1.34.2 on containerd 2.1.5 ...
	I1124 08:43:58.910196 1655502 cli_runner.go:164] Run: docker network inspect addons-674149 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1124 08:43:58.925664 1655502 ssh_runner.go:195] Run: grep 192.168.49.1	host.minikube.internal$ /etc/hosts
	I1124 08:43:58.929344 1655502 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.168.49.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
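The command above uses a grep-filter-then-append pipeline to replace the `host.minikube.internal` entry atomically: drop any stale line, emit the fresh one, then copy the result back over the original. The idiom can be exercised on a scratch file (the `192.168.49.9` replacement address below is illustrative):

```shell
#!/bin/bash
# Exercise the hosts-entry replacement idiom from the log on a scratch file.
hosts=$(mktemp)
printf '127.0.0.1 localhost\n192.168.49.1\thost.minikube.internal\n' > "$hosts"

# Remove any existing entry, append the new mapping, write to a temp file,
# then copy back -- mirroring the { grep -v; echo; } > tmp; cp pipeline.
{ grep -v $'\thost.minikube.internal$' "$hosts"
  printf '192.168.49.9\thost.minikube.internal\n'; } > "$hosts.new"
cp "$hosts.new" "$hosts"

grep -c 'host.minikube.internal' "$hosts"   # exactly one entry remains
rm -f "$hosts" "$hosts.new"
```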
	I1124 08:43:58.938861 1655502 kubeadm.go:884] updating cluster {Name:addons-674149 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.2 ClusterName:addons-674149 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8443 KubernetesVersion:v1.34.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1124 08:43:58.939042 1655502 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.34.2/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.34.2/bin/linux/arm64/kubeadm.sha256
	I1124 08:43:59.089422 1655502 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.34.2/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.34.2/bin/linux/arm64/kubeadm.sha256
	I1124 08:43:59.239996 1655502 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.34.2/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.34.2/bin/linux/arm64/kubeadm.sha256
	I1124 08:43:59.389388 1655502 preload.go:188] Checking if preload exists for k8s version v1.34.2 and runtime containerd
	I1124 08:43:59.389534 1655502 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.34.2/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.34.2/bin/linux/arm64/kubeadm.sha256
	I1124 08:43:59.536747 1655502 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.34.2/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.34.2/bin/linux/arm64/kubeadm.sha256
	I1124 08:43:59.684798 1655502 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.34.2/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.34.2/bin/linux/arm64/kubeadm.sha256
	I1124 08:43:59.851172 1655502 ssh_runner.go:195] Run: sudo crictl images --output json
	I1124 08:43:59.878056 1655502 containerd.go:627] all images are preloaded for containerd runtime.
	I1124 08:43:59.878084 1655502 containerd.go:534] Images already preloaded, skipping extraction
	I1124 08:43:59.878143 1655502 ssh_runner.go:195] Run: sudo crictl images --output json
	I1124 08:43:59.903880 1655502 containerd.go:627] all images are preloaded for containerd runtime.
	I1124 08:43:59.903905 1655502 cache_images.go:86] Images are preloaded, skipping loading
	I1124 08:43:59.903913 1655502 kubeadm.go:935] updating node { 192.168.49.2 8443 v1.34.2 containerd true true} ...
	I1124 08:43:59.904005 1655502 kubeadm.go:947] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.34.2/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=addons-674149 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.49.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.34.2 ClusterName:addons-674149 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I1124 08:43:59.904070 1655502 ssh_runner.go:195] Run: sudo crictl info
	I1124 08:43:59.928943 1655502 cni.go:84] Creating CNI manager for ""
	I1124 08:43:59.928968 1655502 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1124 08:43:59.928987 1655502 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1124 08:43:59.929009 1655502 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.49.2 APIServerPort:8443 KubernetesVersion:v1.34.2 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:addons-674149 NodeName:addons-674149 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.49.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.49.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1124 08:43:59.929130 1655502 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.49.2
	  bindPort: 8443
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "addons-674149"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.49.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8443
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.34.2
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I1124 08:43:59.929206 1655502 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.34.2
	I1124 08:43:59.936731 1655502 binaries.go:51] Found k8s binaries, skipping transfer
	I1124 08:43:59.936822 1655502 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1124 08:43:59.944239 1655502 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (317 bytes)
	I1124 08:43:59.956763 1655502 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I1124 08:43:59.969143 1655502 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2226 bytes)
	I1124 08:43:59.981299 1655502 ssh_runner.go:195] Run: grep 192.168.49.2	control-plane.minikube.internal$ /etc/hosts
	I1124 08:43:59.984935 1655502 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.168.49.2	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I1124 08:43:59.993988 1655502 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1124 08:44:00.194221 1655502 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1124 08:44:00.226995 1655502 certs.go:69] Setting up /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/addons-674149 for IP: 192.168.49.2
	I1124 08:44:00.227020 1655502 certs.go:195] generating shared ca certs ...
	I1124 08:44:00.227040 1655502 certs.go:227] acquiring lock for ca certs: {Name:mkbe540a30c4376a351176f7fe6fec044d058b09 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1124 08:44:00.227192 1655502 certs.go:241] generating "minikubeCA" ca cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.key
	I1124 08:44:00.371060 1655502 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.crt ...
	I1124 08:44:00.372841 1655502 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.crt: {Name:mka8e3e47890b99276527a0247c1667cb020f571 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1124 08:44:00.378289 1655502 crypto.go:164] Writing key to /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.key ...
	I1124 08:44:00.378319 1655502 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.key: {Name:mk9382756fe0d93d081d4c3964ee102aa3eb6923 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1124 08:44:00.378499 1655502 certs.go:241] generating "proxyClientCA" ca cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/proxy-client-ca.key
	I1124 08:44:00.738197 1655502 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/21978-1652607/.minikube/proxy-client-ca.crt ...
	I1124 08:44:00.738237 1655502 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21978-1652607/.minikube/proxy-client-ca.crt: {Name:mka6ab5bf796a0aac1b0077bf587c8a2dacf1a0b Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1124 08:44:00.738510 1655502 crypto.go:164] Writing key to /home/jenkins/minikube-integration/21978-1652607/.minikube/proxy-client-ca.key ...
	I1124 08:44:00.738539 1655502 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21978-1652607/.minikube/proxy-client-ca.key: {Name:mkd017df5c87ad40eafef0d555effb049cb54236 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1124 08:44:00.738804 1655502 certs.go:257] generating profile certs ...
	I1124 08:44:00.738897 1655502 certs.go:364] generating signed profile cert for "minikube-user": /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/addons-674149/client.key
	I1124 08:44:00.738926 1655502 crypto.go:68] Generating cert /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/addons-674149/client.crt with IP's: []
	I1124 08:44:01.090800 1655502 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/addons-674149/client.crt ...
	I1124 08:44:01.090838 1655502 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/addons-674149/client.crt: {Name:mk9638d303c1e822a2feed2a05dad3a11447bd13 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1124 08:44:01.091045 1655502 crypto.go:164] Writing key to /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/addons-674149/client.key ...
	I1124 08:44:01.091061 1655502 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/addons-674149/client.key: {Name:mkdd04d6e21e44ccefacf67234d0c3144ff70f44 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1124 08:44:01.091147 1655502 certs.go:364] generating signed profile cert for "minikube": /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/addons-674149/apiserver.key.45330729
	I1124 08:44:01.091168 1655502 crypto.go:68] Generating cert /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/addons-674149/apiserver.crt.45330729 with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.168.49.2]
	I1124 08:44:01.136970 1655502 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/addons-674149/apiserver.crt.45330729 ...
	I1124 08:44:01.137003 1655502 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/addons-674149/apiserver.crt.45330729: {Name:mk49b619587227118399d7e8566bf96cadb39eb0 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1124 08:44:01.137192 1655502 crypto.go:164] Writing key to /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/addons-674149/apiserver.key.45330729 ...
	I1124 08:44:01.137208 1655502 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/addons-674149/apiserver.key.45330729: {Name:mk7274430f0c4bed4cc067327ca180dbad8e55b7 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1124 08:44:01.137286 1655502 certs.go:382] copying /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/addons-674149/apiserver.crt.45330729 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/addons-674149/apiserver.crt
	I1124 08:44:01.137362 1655502 certs.go:386] copying /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/addons-674149/apiserver.key.45330729 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/addons-674149/apiserver.key
	I1124 08:44:01.137421 1655502 certs.go:364] generating signed profile cert for "aggregator": /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/addons-674149/proxy-client.key
	I1124 08:44:01.137448 1655502 crypto.go:68] Generating cert /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/addons-674149/proxy-client.crt with IP's: []
	I1124 08:44:01.851632 1655502 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/addons-674149/proxy-client.crt ...
	I1124 08:44:01.851665 1655502 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/addons-674149/proxy-client.crt: {Name:mk82144a343570d5a32189ef0d12547a7f60cd9c Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1124 08:44:01.851871 1655502 crypto.go:164] Writing key to /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/addons-674149/proxy-client.key ...
	I1124 08:44:01.851886 1655502 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/addons-674149/proxy-client.key: {Name:mk58d1933148cb2110c225b3f632c3e3a5555046 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1124 08:44:01.852100 1655502 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca-key.pem (1671 bytes)
	I1124 08:44:01.852148 1655502 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem (1078 bytes)
	I1124 08:44:01.852180 1655502 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/cert.pem (1123 bytes)
	I1124 08:44:01.852210 1655502 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/key.pem (1679 bytes)
	I1124 08:44:01.852778 1655502 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1124 08:44:01.869974 1655502 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I1124 08:44:01.896981 1655502 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1124 08:44:01.917432 1655502 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I1124 08:44:01.940406 1655502 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/addons-674149/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1419 bytes)
	I1124 08:44:01.958577 1655502 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/addons-674149/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I1124 08:44:01.976491 1655502 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/addons-674149/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1124 08:44:01.994084 1655502 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/addons-674149/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I1124 08:44:02.014100 1655502 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1124 08:44:02.034385 1655502 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I1124 08:44:02.049171 1655502 ssh_runner.go:195] Run: openssl version
	I1124 08:44:02.055832 1655502 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I1124 08:44:02.064756 1655502 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1124 08:44:02.068422 1655502 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Nov 24 08:44 /usr/share/ca-certificates/minikubeCA.pem
	I1124 08:44:02.068504 1655502 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1124 08:44:02.109368 1655502 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I1124 08:44:02.117451 1655502 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1124 08:44:02.120909 1655502 certs.go:400] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I1124 08:44:02.120975 1655502 kubeadm.go:401] StartCluster: {Name:addons-674149 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.2 ClusterName:addons-674149 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8443 KubernetesVersion:v1.34.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1124 08:44:02.121056 1655502 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I1124 08:44:02.121114 1655502 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1124 08:44:02.147196 1655502 cri.go:89] found id: ""
	I1124 08:44:02.147304 1655502 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1124 08:44:02.154845 1655502 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1124 08:44:02.162245 1655502 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1124 08:44:02.162354 1655502 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1124 08:44:02.170189 1655502 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1124 08:44:02.170207 1655502 kubeadm.go:158] found existing configuration files:
	
	I1124 08:44:02.170260 1655502 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf
	I1124 08:44:02.178091 1655502 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1124 08:44:02.178158 1655502 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1124 08:44:02.185991 1655502 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf
	I1124 08:44:02.193723 1655502 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1124 08:44:02.193818 1655502 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1124 08:44:02.201804 1655502 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf
	I1124 08:44:02.209409 1655502 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1124 08:44:02.209501 1655502 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1124 08:44:02.216782 1655502 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf
	I1124 08:44:02.224475 1655502 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1124 08:44:02.224543 1655502 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1124 08:44:02.232185 1655502 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.34.2:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1124 08:44:02.273442 1655502 kubeadm.go:319] [init] Using Kubernetes version: v1.34.2
	I1124 08:44:02.273503 1655502 kubeadm.go:319] [preflight] Running pre-flight checks
	I1124 08:44:02.294755 1655502 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1124 08:44:02.294833 1655502 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1124 08:44:02.294871 1655502 kubeadm.go:319] OS: Linux
	I1124 08:44:02.294922 1655502 kubeadm.go:319] CGROUPS_CPU: enabled
	I1124 08:44:02.294975 1655502 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1124 08:44:02.295028 1655502 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1124 08:44:02.295081 1655502 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1124 08:44:02.295132 1655502 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1124 08:44:02.295201 1655502 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1124 08:44:02.295252 1655502 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1124 08:44:02.295305 1655502 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1124 08:44:02.295356 1655502 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1124 08:44:02.367265 1655502 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1124 08:44:02.367378 1655502 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1124 08:44:02.367471 1655502 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1124 08:44:02.372970 1655502 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1124 08:44:02.379198 1655502 out.go:252]   - Generating certificates and keys ...
	I1124 08:44:02.379339 1655502 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1124 08:44:02.379426 1655502 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1124 08:44:02.661261 1655502 kubeadm.go:319] [certs] Generating "apiserver-kubelet-client" certificate and key
	I1124 08:44:02.840828 1655502 kubeadm.go:319] [certs] Generating "front-proxy-ca" certificate and key
	I1124 08:44:03.374097 1655502 kubeadm.go:319] [certs] Generating "front-proxy-client" certificate and key
	I1124 08:44:03.554529 1655502 kubeadm.go:319] [certs] Generating "etcd/ca" certificate and key
	I1124 08:44:04.071031 1655502 kubeadm.go:319] [certs] Generating "etcd/server" certificate and key
	I1124 08:44:04.071171 1655502 kubeadm.go:319] [certs] etcd/server serving cert is signed for DNS names [addons-674149 localhost] and IPs [192.168.49.2 127.0.0.1 ::1]
	I1124 08:44:05.036563 1655502 kubeadm.go:319] [certs] Generating "etcd/peer" certificate and key
	I1124 08:44:05.036706 1655502 kubeadm.go:319] [certs] etcd/peer serving cert is signed for DNS names [addons-674149 localhost] and IPs [192.168.49.2 127.0.0.1 ::1]
	I1124 08:44:05.373272 1655502 kubeadm.go:319] [certs] Generating "etcd/healthcheck-client" certificate and key
	I1124 08:44:05.834485 1655502 kubeadm.go:319] [certs] Generating "apiserver-etcd-client" certificate and key
	I1124 08:44:06.884360 1655502 kubeadm.go:319] [certs] Generating "sa" key and public key
	I1124 08:44:06.884623 1655502 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1124 08:44:07.453982 1655502 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1124 08:44:08.373748 1655502 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1124 08:44:09.333130 1655502 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1124 08:44:09.691918 1655502 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1124 08:44:10.248631 1655502 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1124 08:44:10.249365 1655502 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1124 08:44:10.252250 1655502 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1124 08:44:10.255635 1655502 out.go:252]   - Booting up control plane ...
	I1124 08:44:10.255747 1655502 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1124 08:44:10.255833 1655502 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1124 08:44:10.256545 1655502 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1124 08:44:10.272405 1655502 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1124 08:44:10.272738 1655502 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1124 08:44:10.279942 1655502 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1124 08:44:10.282535 1655502 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1124 08:44:10.282809 1655502 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1124 08:44:10.412971 1655502 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1124 08:44:10.413093 1655502 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1124 08:44:11.913330 1655502 kubeadm.go:319] [kubelet-check] The kubelet is healthy after 1.500696798s
	I1124 08:44:11.917211 1655502 kubeadm.go:319] [control-plane-check] Waiting for healthy control plane components. This can take up to 4m0s
	I1124 08:44:11.917307 1655502 kubeadm.go:319] [control-plane-check] Checking kube-apiserver at https://192.168.49.2:8443/livez
	I1124 08:44:11.917424 1655502 kubeadm.go:319] [control-plane-check] Checking kube-controller-manager at https://127.0.0.1:10257/healthz
	I1124 08:44:11.917503 1655502 kubeadm.go:319] [control-plane-check] Checking kube-scheduler at https://127.0.0.1:10259/livez
	I1124 08:44:15.702064 1655502 kubeadm.go:319] [control-plane-check] kube-controller-manager is healthy after 3.784087975s
	I1124 08:44:16.542197 1655502 kubeadm.go:319] [control-plane-check] kube-scheduler is healthy after 4.624893411s
	I1124 08:44:18.419071 1655502 kubeadm.go:319] [control-plane-check] kube-apiserver is healthy after 6.501492822s
	I1124 08:44:18.450903 1655502 kubeadm.go:319] [upload-config] Storing the configuration used in ConfigMap "kubeadm-config" in the "kube-system" Namespace
	I1124 08:44:18.465541 1655502 kubeadm.go:319] [kubelet] Creating a ConfigMap "kubelet-config" in namespace kube-system with the configuration for the kubelets in the cluster
	I1124 08:44:18.479660 1655502 kubeadm.go:319] [upload-certs] Skipping phase. Please see --upload-certs
	I1124 08:44:18.479901 1655502 kubeadm.go:319] [mark-control-plane] Marking the node addons-674149 as control-plane by adding the labels: [node-role.kubernetes.io/control-plane node.kubernetes.io/exclude-from-external-load-balancers]
	I1124 08:44:18.492516 1655502 kubeadm.go:319] [bootstrap-token] Using token: xzsuk9.72pqdproti8guzpn
	I1124 08:44:18.495500 1655502 out.go:252]   - Configuring RBAC rules ...
	I1124 08:44:18.495638 1655502 kubeadm.go:319] [bootstrap-token] Configuring bootstrap tokens, cluster-info ConfigMap, RBAC Roles
	I1124 08:44:18.499713 1655502 kubeadm.go:319] [bootstrap-token] Configured RBAC rules to allow Node Bootstrap tokens to get nodes
	I1124 08:44:18.512318 1655502 kubeadm.go:319] [bootstrap-token] Configured RBAC rules to allow Node Bootstrap tokens to post CSRs in order for nodes to get long term certificate credentials
	I1124 08:44:18.517607 1655502 kubeadm.go:319] [bootstrap-token] Configured RBAC rules to allow the csrapprover controller automatically approve CSRs from a Node Bootstrap Token
	I1124 08:44:18.524482 1655502 kubeadm.go:319] [bootstrap-token] Configured RBAC rules to allow certificate rotation for all node client certificates in the cluster
	I1124 08:44:18.529681 1655502 kubeadm.go:319] [bootstrap-token] Creating the "cluster-info" ConfigMap in the "kube-public" namespace
	I1124 08:44:18.826354 1655502 kubeadm.go:319] [kubelet-finalize] Updating "/etc/kubernetes/kubelet.conf" to point to a rotatable kubelet client certificate and key
	I1124 08:44:19.252583 1655502 kubeadm.go:319] [addons] Applied essential addon: CoreDNS
	I1124 08:44:19.828252 1655502 kubeadm.go:319] [addons] Applied essential addon: kube-proxy
	I1124 08:44:19.829455 1655502 kubeadm.go:319] 
	I1124 08:44:19.829530 1655502 kubeadm.go:319] Your Kubernetes control-plane has initialized successfully!
	I1124 08:44:19.829541 1655502 kubeadm.go:319] 
	I1124 08:44:19.829618 1655502 kubeadm.go:319] To start using your cluster, you need to run the following as a regular user:
	I1124 08:44:19.829626 1655502 kubeadm.go:319] 
	I1124 08:44:19.829651 1655502 kubeadm.go:319]   mkdir -p $HOME/.kube
	I1124 08:44:19.829721 1655502 kubeadm.go:319]   sudo cp -i /etc/kubernetes/admin.conf $HOME/.kube/config
	I1124 08:44:19.829803 1655502 kubeadm.go:319]   sudo chown $(id -u):$(id -g) $HOME/.kube/config
	I1124 08:44:19.829816 1655502 kubeadm.go:319] 
	I1124 08:44:19.829908 1655502 kubeadm.go:319] Alternatively, if you are the root user, you can run:
	I1124 08:44:19.829916 1655502 kubeadm.go:319] 
	I1124 08:44:19.829970 1655502 kubeadm.go:319]   export KUBECONFIG=/etc/kubernetes/admin.conf
	I1124 08:44:19.829981 1655502 kubeadm.go:319] 
	I1124 08:44:19.830058 1655502 kubeadm.go:319] You should now deploy a pod network to the cluster.
	I1124 08:44:19.830155 1655502 kubeadm.go:319] Run "kubectl apply -f [podnetwork].yaml" with one of the options listed at:
	I1124 08:44:19.830244 1655502 kubeadm.go:319]   https://kubernetes.io/docs/concepts/cluster-administration/addons/
	I1124 08:44:19.830250 1655502 kubeadm.go:319] 
	I1124 08:44:19.830346 1655502 kubeadm.go:319] You can now join any number of control-plane nodes by copying certificate authorities
	I1124 08:44:19.830431 1655502 kubeadm.go:319] and service account keys on each node and then running the following as root:
	I1124 08:44:19.830436 1655502 kubeadm.go:319] 
	I1124 08:44:19.830554 1655502 kubeadm.go:319]   kubeadm join control-plane.minikube.internal:8443 --token xzsuk9.72pqdproti8guzpn \
	I1124 08:44:19.830687 1655502 kubeadm.go:319] 	--discovery-token-ca-cert-hash sha256:d55a4d7f583a51ce0fd49715bdb144dc7e07b5286773075ca535e70f191df377 \
	I1124 08:44:19.830763 1655502 kubeadm.go:319] 	--control-plane 
	I1124 08:44:19.830777 1655502 kubeadm.go:319] 
	I1124 08:44:19.830863 1655502 kubeadm.go:319] Then you can join any number of worker nodes by running the following on each as root:
	I1124 08:44:19.830879 1655502 kubeadm.go:319] 
	I1124 08:44:19.830962 1655502 kubeadm.go:319] kubeadm join control-plane.minikube.internal:8443 --token xzsuk9.72pqdproti8guzpn \
	I1124 08:44:19.831069 1655502 kubeadm.go:319] 	--discovery-token-ca-cert-hash sha256:d55a4d7f583a51ce0fd49715bdb144dc7e07b5286773075ca535e70f191df377 
	I1124 08:44:19.834239 1655502 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is in maintenance mode, please migrate to cgroups v2
	I1124 08:44:19.834512 1655502 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1124 08:44:19.834652 1655502 kubeadm.go:319] 	[WARNING Service-Kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1124 08:44:19.834683 1655502 cni.go:84] Creating CNI manager for ""
	I1124 08:44:19.834697 1655502 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1124 08:44:19.837794 1655502 out.go:179] * Configuring CNI (Container Networking Interface) ...
	I1124 08:44:19.840636 1655502 ssh_runner.go:195] Run: stat /opt/cni/bin/portmap
	I1124 08:44:19.844674 1655502 cni.go:182] applying CNI manifest using /var/lib/minikube/binaries/v1.34.2/kubectl ...
	I1124 08:44:19.844694 1655502 ssh_runner.go:362] scp memory --> /var/tmp/minikube/cni.yaml (2601 bytes)
	I1124 08:44:19.858739 1655502 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.34.2/kubectl apply --kubeconfig=/var/lib/minikube/kubeconfig -f /var/tmp/minikube/cni.yaml
	I1124 08:44:20.193420 1655502 ssh_runner.go:195] Run: /bin/bash -c "cat /proc/$(pgrep kube-apiserver)/oom_adj"
	I1124 08:44:20.193550 1655502 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.34.2/kubectl --kubeconfig=/var/lib/minikube/kubeconfig label --overwrite nodes addons-674149 minikube.k8s.io/updated_at=2025_11_24T08_44_20_0700 minikube.k8s.io/version=v1.37.0 minikube.k8s.io/commit=393ee3e0b845623107dce6cda4f48ffd5c3d1811 minikube.k8s.io/name=addons-674149 minikube.k8s.io/primary=true
	I1124 08:44:20.193582 1655502 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.34.2/kubectl create clusterrolebinding minikube-rbac --clusterrole=cluster-admin --serviceaccount=kube-system:default --kubeconfig=/var/lib/minikube/kubeconfig
	I1124 08:44:20.208272 1655502 ops.go:34] apiserver oom_adj: -16
	I1124 08:44:20.396176 1655502 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.34.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I1124 08:44:20.896270 1655502 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.34.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I1124 08:44:21.396210 1655502 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.34.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I1124 08:44:21.896276 1655502 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.34.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I1124 08:44:22.396906 1655502 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.34.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I1124 08:44:22.896299 1655502 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.34.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I1124 08:44:23.396300 1655502 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.34.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I1124 08:44:23.896221 1655502 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.34.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I1124 08:44:24.396312 1655502 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.34.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I1124 08:44:24.506210 1655502 kubeadm.go:1114] duration metric: took 4.31278144s to wait for elevateKubeSystemPrivileges
	I1124 08:44:24.506241 1655502 kubeadm.go:403] duration metric: took 22.385285102s to StartCluster
	I1124 08:44:24.506269 1655502 settings.go:142] acquiring lock: {Name:mk6c04793f5fd4f38f92abf4357247f2ccd7fc4e Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1124 08:44:24.506382 1655502 settings.go:150] Updating kubeconfig:  /home/jenkins/minikube-integration/21978-1652607/kubeconfig
	I1124 08:44:24.506804 1655502 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21978-1652607/kubeconfig: {Name:mk02121ae6148bede61eabf0ed4e1826024715f4 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1124 08:44:24.507016 1655502 start.go:236] Will wait 6m0s for node &{Name: IP:192.168.49.2 Port:8443 KubernetesVersion:v1.34.2 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I1124 08:44:24.507132 1655502 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.34.2/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml"
	I1124 08:44:24.507382 1655502 config.go:182] Loaded profile config "addons-674149": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
	I1124 08:44:24.507421 1655502 addons.go:527] enable addons start: toEnable=map[ambassador:false amd-gpu-device-plugin:true auto-pause:false cloud-spanner:true csi-hostpath-driver:true dashboard:false default-storageclass:true efk:false freshpod:false gcp-auth:true gvisor:false headlamp:false inaccel:false ingress:true ingress-dns:true inspektor-gadget:true istio:false istio-provisioner:false kong:false kubeflow:false kubetail:false kubevirt:false logviewer:false metallb:false metrics-server:true nvidia-device-plugin:true nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:true registry-aliases:false registry-creds:true storage-provisioner:true storage-provisioner-rancher:true volcano:true volumesnapshots:true yakd:true]
	I1124 08:44:24.507506 1655502 addons.go:70] Setting yakd=true in profile "addons-674149"
	I1124 08:44:24.507525 1655502 addons.go:239] Setting addon yakd=true in "addons-674149"
	I1124 08:44:24.507555 1655502 host.go:66] Checking if "addons-674149" exists ...
	I1124 08:44:24.508076 1655502 cli_runner.go:164] Run: docker container inspect addons-674149 --format={{.State.Status}}
	I1124 08:44:24.508622 1655502 addons.go:70] Setting inspektor-gadget=true in profile "addons-674149"
	I1124 08:44:24.508642 1655502 addons.go:239] Setting addon inspektor-gadget=true in "addons-674149"
	I1124 08:44:24.508666 1655502 host.go:66] Checking if "addons-674149" exists ...
	I1124 08:44:24.509099 1655502 cli_runner.go:164] Run: docker container inspect addons-674149 --format={{.State.Status}}
	I1124 08:44:24.509351 1655502 addons.go:70] Setting metrics-server=true in profile "addons-674149"
	I1124 08:44:24.509384 1655502 addons.go:239] Setting addon metrics-server=true in "addons-674149"
	I1124 08:44:24.509426 1655502 host.go:66] Checking if "addons-674149" exists ...
	I1124 08:44:24.509639 1655502 addons.go:70] Setting amd-gpu-device-plugin=true in profile "addons-674149"
	I1124 08:44:24.509664 1655502 addons.go:239] Setting addon amd-gpu-device-plugin=true in "addons-674149"
	I1124 08:44:24.509685 1655502 host.go:66] Checking if "addons-674149" exists ...
	I1124 08:44:24.510025 1655502 cli_runner.go:164] Run: docker container inspect addons-674149 --format={{.State.Status}}
	I1124 08:44:24.510115 1655502 cli_runner.go:164] Run: docker container inspect addons-674149 --format={{.State.Status}}
	I1124 08:44:24.511650 1655502 addons.go:70] Setting nvidia-device-plugin=true in profile "addons-674149"
	I1124 08:44:24.511695 1655502 addons.go:239] Setting addon nvidia-device-plugin=true in "addons-674149"
	I1124 08:44:24.511727 1655502 host.go:66] Checking if "addons-674149" exists ...
	I1124 08:44:24.512183 1655502 cli_runner.go:164] Run: docker container inspect addons-674149 --format={{.State.Status}}
	I1124 08:44:24.512712 1655502 addons.go:70] Setting registry=true in profile "addons-674149"
	I1124 08:44:24.512741 1655502 addons.go:239] Setting addon registry=true in "addons-674149"
	I1124 08:44:24.512767 1655502 host.go:66] Checking if "addons-674149" exists ...
	I1124 08:44:24.513192 1655502 cli_runner.go:164] Run: docker container inspect addons-674149 --format={{.State.Status}}
	I1124 08:44:24.522803 1655502 addons.go:70] Setting registry-creds=true in profile "addons-674149"
	I1124 08:44:24.522836 1655502 addons.go:239] Setting addon registry-creds=true in "addons-674149"
	I1124 08:44:24.522913 1655502 host.go:66] Checking if "addons-674149" exists ...
	I1124 08:44:24.523860 1655502 addons.go:70] Setting storage-provisioner=true in profile "addons-674149"
	I1124 08:44:24.523899 1655502 addons.go:239] Setting addon storage-provisioner=true in "addons-674149"
	I1124 08:44:24.523939 1655502 host.go:66] Checking if "addons-674149" exists ...
	I1124 08:44:24.524416 1655502 cli_runner.go:164] Run: docker container inspect addons-674149 --format={{.State.Status}}
	I1124 08:44:24.530720 1655502 addons.go:70] Setting storage-provisioner-rancher=true in profile "addons-674149"
	I1124 08:44:24.530768 1655502 addons_storage_classes.go:34] enableOrDisableStorageClasses storage-provisioner-rancher=true on "addons-674149"
	I1124 08:44:24.531147 1655502 cli_runner.go:164] Run: docker container inspect addons-674149 --format={{.State.Status}}
	I1124 08:44:24.537787 1655502 addons.go:70] Setting cloud-spanner=true in profile "addons-674149"
	I1124 08:44:24.538081 1655502 addons.go:239] Setting addon cloud-spanner=true in "addons-674149"
	I1124 08:44:24.538177 1655502 host.go:66] Checking if "addons-674149" exists ...
	I1124 08:44:24.539497 1655502 cli_runner.go:164] Run: docker container inspect addons-674149 --format={{.State.Status}}
	I1124 08:44:24.560713 1655502 addons.go:70] Setting volcano=true in profile "addons-674149"
	I1124 08:44:24.560844 1655502 addons.go:239] Setting addon volcano=true in "addons-674149"
	I1124 08:44:24.561282 1655502 host.go:66] Checking if "addons-674149" exists ...
	I1124 08:44:24.568802 1655502 cli_runner.go:164] Run: docker container inspect addons-674149 --format={{.State.Status}}
	I1124 08:44:24.572671 1655502 addons.go:70] Setting csi-hostpath-driver=true in profile "addons-674149"
	I1124 08:44:24.572800 1655502 addons.go:239] Setting addon csi-hostpath-driver=true in "addons-674149"
	I1124 08:44:24.572874 1655502 host.go:66] Checking if "addons-674149" exists ...
	I1124 08:44:24.573553 1655502 cli_runner.go:164] Run: docker container inspect addons-674149 --format={{.State.Status}}
	I1124 08:44:24.589948 1655502 addons.go:70] Setting volumesnapshots=true in profile "addons-674149"
	I1124 08:44:24.590148 1655502 addons.go:239] Setting addon volumesnapshots=true in "addons-674149"
	I1124 08:44:24.590222 1655502 host.go:66] Checking if "addons-674149" exists ...
	I1124 08:44:24.591025 1655502 cli_runner.go:164] Run: docker container inspect addons-674149 --format={{.State.Status}}
	I1124 08:44:24.595848 1655502 addons.go:70] Setting default-storageclass=true in profile "addons-674149"
	I1124 08:44:24.595961 1655502 addons_storage_classes.go:34] enableOrDisableStorageClasses default-storageclass=true on "addons-674149"
	I1124 08:44:24.596460 1655502 cli_runner.go:164] Run: docker container inspect addons-674149 --format={{.State.Status}}
	I1124 08:44:24.631788 1655502 out.go:179] * Verifying Kubernetes components...
	I1124 08:44:24.635132 1655502 addons.go:70] Setting gcp-auth=true in profile "addons-674149"
	I1124 08:44:24.635171 1655502 mustload.go:66] Loading cluster: addons-674149
	I1124 08:44:24.635310 1655502 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1124 08:44:24.635966 1655502 config.go:182] Loaded profile config "addons-674149": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
	I1124 08:44:24.636448 1655502 cli_runner.go:164] Run: docker container inspect addons-674149 --format={{.State.Status}}
	I1124 08:44:24.669012 1655502 addons.go:70] Setting ingress=true in profile "addons-674149"
	I1124 08:44:24.669108 1655502 addons.go:239] Setting addon ingress=true in "addons-674149"
	I1124 08:44:24.669355 1655502 host.go:66] Checking if "addons-674149" exists ...
	I1124 08:44:24.699093 1655502 addons.go:70] Setting ingress-dns=true in profile "addons-674149"
	I1124 08:44:24.699186 1655502 addons.go:239] Setting addon ingress-dns=true in "addons-674149"
	I1124 08:44:24.699266 1655502 host.go:66] Checking if "addons-674149" exists ...
	I1124 08:44:24.699917 1655502 cli_runner.go:164] Run: docker container inspect addons-674149 --format={{.State.Status}}
	I1124 08:44:24.713442 1655502 cli_runner.go:164] Run: docker container inspect addons-674149 --format={{.State.Status}}
	I1124 08:44:24.732540 1655502 out.go:179]   - Using image gcr.io/k8s-minikube/kube-registry-proxy:0.0.9
	I1124 08:44:24.752928 1655502 out.go:179]   - Using image docker.io/rocm/k8s-device-plugin:1.25.2.8
	I1124 08:44:24.808342 1655502 out.go:179]   - Using image docker.io/registry:3.0.0
	I1124 08:44:24.808552 1655502 out.go:179]   - Using image nvcr.io/nvidia/k8s-device-plugin:v0.18.0
	I1124 08:44:24.808886 1655502 cli_runner.go:164] Run: docker container inspect addons-674149 --format={{.State.Status}}
	I1124 08:44:24.821877 1655502 out.go:179]   - Using image docker.io/marcnuri/yakd:0.0.5
	I1124 08:44:24.837775 1655502 addons.go:436] installing /etc/kubernetes/addons/amd-gpu-device-plugin.yaml
	I1124 08:44:24.850060 1655502 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/amd-gpu-device-plugin.yaml (1868 bytes)
	I1124 08:44:24.850130 1655502 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-674149
	I1124 08:44:24.837975 1655502 addons.go:436] installing /etc/kubernetes/addons/nvidia-device-plugin.yaml
	I1124 08:44:24.856353 1655502 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/nvidia-device-plugin.yaml (1966 bytes)
	I1124 08:44:24.856520 1655502 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-674149
	I1124 08:44:24.838006 1655502 addons.go:436] installing /etc/kubernetes/addons/registry-rc.yaml
	I1124 08:44:24.868499 1655502 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/registry-rc.yaml (860 bytes)
	I1124 08:44:24.868612 1655502 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-674149
	I1124 08:44:24.838220 1655502 out.go:179]   - Using image registry.k8s.io/metrics-server/metrics-server:v0.8.0
	I1124 08:44:24.870296 1655502 out.go:179]   - Using image registry.k8s.io/sig-storage/snapshot-controller:v6.1.0
	I1124 08:44:24.838306 1655502 out.go:179]   - Using image ghcr.io/inspektor-gadget/inspektor-gadget:v0.46.0
	I1124 08:44:24.838311 1655502 out.go:179]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	I1124 08:44:24.838326 1655502 addons.go:436] installing /etc/kubernetes/addons/yakd-ns.yaml
	I1124 08:44:24.838363 1655502 host.go:66] Checking if "addons-674149" exists ...
	I1124 08:44:24.847577 1655502 addons.go:239] Setting addon storage-provisioner-rancher=true in "addons-674149"
	I1124 08:44:24.850037 1655502 addons.go:239] Setting addon default-storageclass=true in "addons-674149"
	I1124 08:44:24.871305 1655502 host.go:66] Checking if "addons-674149" exists ...
	I1124 08:44:24.871753 1655502 cli_runner.go:164] Run: docker container inspect addons-674149 --format={{.State.Status}}
	I1124 08:44:24.873824 1655502 host.go:66] Checking if "addons-674149" exists ...
	I1124 08:44:24.874278 1655502 cli_runner.go:164] Run: docker container inspect addons-674149 --format={{.State.Status}}
	I1124 08:44:24.888828 1655502 ssh_runner.go:362] scp yakd/yakd-ns.yaml --> /etc/kubernetes/addons/yakd-ns.yaml (171 bytes)
	I1124 08:44:24.888918 1655502 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-674149
	I1124 08:44:24.893225 1655502 out.go:179]   - Using image docker.io/volcanosh/vc-controller-manager:v1.13.0
	I1124 08:44:24.893855 1655502 addons.go:436] installing /etc/kubernetes/addons/ig-deployment.yaml
	I1124 08:44:24.893881 1655502 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/ig-deployment.yaml (15034 bytes)
	I1124 08:44:24.893951 1655502 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-674149
	I1124 08:44:24.896515 1655502 addons.go:436] installing /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 08:44:24.896533 1655502 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
	I1124 08:44:24.896585 1655502 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-674149
	I1124 08:44:24.910387 1655502 addons.go:436] installing /etc/kubernetes/addons/metrics-apiservice.yaml
	I1124 08:44:24.910550 1655502 ssh_runner.go:362] scp metrics-server/metrics-apiservice.yaml --> /etc/kubernetes/addons/metrics-apiservice.yaml (424 bytes)
	I1124 08:44:24.910642 1655502 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-674149
	I1124 08:44:24.925118 1655502 out.go:179]   - Using image registry.k8s.io/sig-storage/csi-external-health-monitor-controller:v0.7.0
	I1124 08:44:24.925395 1655502 out.go:179]   - Using image gcr.io/cloud-spanner-emulator/emulator:1.5.45
	I1124 08:44:24.926702 1655502 addons.go:436] installing /etc/kubernetes/addons/csi-hostpath-snapshotclass.yaml
	I1124 08:44:24.926720 1655502 ssh_runner.go:362] scp volumesnapshots/csi-hostpath-snapshotclass.yaml --> /etc/kubernetes/addons/csi-hostpath-snapshotclass.yaml (934 bytes)
	I1124 08:44:24.926966 1655502 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-674149
	I1124 08:44:24.938710 1655502 addons.go:436] installing /etc/kubernetes/addons/deployment.yaml
	I1124 08:44:24.947730 1655502 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/deployment.yaml (1004 bytes)
	I1124 08:44:24.947806 1655502 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-674149
	I1124 08:44:24.955108 1655502 out.go:179]   - Using image docker.io/volcanosh/vc-scheduler:v1.13.0
	I1124 08:44:24.982949 1655502 out.go:179]   - Using image docker.io/kicbase/minikube-ingress-dns:0.0.4
	I1124 08:44:24.983167 1655502 out.go:179]   - Using image docker.io/upmcenterprises/registry-creds:1.10
	I1124 08:44:25.041936 1655502 addons.go:436] installing /etc/kubernetes/addons/ingress-dns-pod.yaml
	I1124 08:44:25.042004 1655502 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/ingress-dns-pod.yaml (2889 bytes)
	I1124 08:44:25.042104 1655502 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-674149
	I1124 08:44:25.042915 1655502 addons.go:436] installing /etc/kubernetes/addons/registry-creds-rc.yaml
	I1124 08:44:25.042968 1655502 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/registry-creds-rc.yaml (3306 bytes)
	I1124 08:44:25.043052 1655502 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-674149
	I1124 08:44:25.065961 1655502 out.go:179]   - Using image registry.k8s.io/sig-storage/csi-node-driver-registrar:v2.6.0
	I1124 08:44:25.076579 1655502 out.go:179]   - Using image registry.k8s.io/sig-storage/hostpathplugin:v1.9.0
	I1124 08:44:25.078566 1655502 out.go:179]   - Using image docker.io/volcanosh/vc-webhook-manager:v1.13.0
	I1124 08:44:25.089314 1655502 addons.go:436] installing /etc/kubernetes/addons/volcano-deployment.yaml
	I1124 08:44:25.089399 1655502 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/volcano-deployment.yaml (1017570 bytes)
	I1124 08:44:25.089502 1655502 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-674149
	I1124 08:44:25.094868 1655502 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34664 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/addons-674149/id_rsa Username:docker}
	I1124 08:44:25.110781 1655502 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34664 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/addons-674149/id_rsa Username:docker}
	I1124 08:44:25.111573 1655502 out.go:179]   - Using image registry.k8s.io/sig-storage/livenessprobe:v2.8.0
	I1124 08:44:25.111656 1655502 out.go:179]   - Using image registry.k8s.io/ingress-nginx/kube-webhook-certgen:v1.6.4
	I1124 08:44:25.117374 1655502 out.go:179]   - Using image docker.io/rancher/local-path-provisioner:v0.0.22
	I1124 08:44:25.122102 1655502 out.go:179]   - Using image registry.k8s.io/sig-storage/csi-resizer:v1.6.0
	I1124 08:44:25.122175 1655502 out.go:179]   - Using image registry.k8s.io/ingress-nginx/kube-webhook-certgen:v1.6.4
	I1124 08:44:25.127927 1655502 out.go:179]   - Using image registry.k8s.io/ingress-nginx/controller:v1.14.0
	I1124 08:44:25.128849 1655502 out.go:179]   - Using image docker.io/busybox:stable
	I1124 08:44:25.129131 1655502 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34664 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/addons-674149/id_rsa Username:docker}
	I1124 08:44:25.129157 1655502 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34664 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/addons-674149/id_rsa Username:docker}
	I1124 08:44:25.132229 1655502 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34664 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/addons-674149/id_rsa Username:docker}
	I1124 08:44:25.133027 1655502 addons.go:436] installing /etc/kubernetes/addons/ingress-deploy.yaml
	I1124 08:44:25.133079 1655502 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/ingress-deploy.yaml (16078 bytes)
	I1124 08:44:25.133237 1655502 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-674149
	I1124 08:44:25.145586 1655502 out.go:179]   - Using image registry.k8s.io/sig-storage/csi-snapshotter:v6.1.0
	I1124 08:44:25.150262 1655502 addons.go:436] installing /etc/kubernetes/addons/storage-provisioner-rancher.yaml
	I1124 08:44:25.150285 1655502 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner-rancher.yaml (3113 bytes)
	I1124 08:44:25.150352 1655502 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-674149
	I1124 08:44:25.177954 1655502 out.go:179]   - Using image registry.k8s.io/sig-storage/csi-provisioner:v3.3.0
	I1124 08:44:25.182617 1655502 out.go:179]   - Using image registry.k8s.io/sig-storage/csi-attacher:v4.0.0
	I1124 08:44:25.186110 1655502 addons.go:436] installing /etc/kubernetes/addons/rbac-external-attacher.yaml
	I1124 08:44:25.186190 1655502 ssh_runner.go:362] scp csi-hostpath-driver/rbac/rbac-external-attacher.yaml --> /etc/kubernetes/addons/rbac-external-attacher.yaml (3073 bytes)
	I1124 08:44:25.186262 1655502 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-674149
	I1124 08:44:25.189981 1655502 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34664 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/addons-674149/id_rsa Username:docker}
	I1124 08:44:25.192923 1655502 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34664 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/addons-674149/id_rsa Username:docker}
	I1124 08:44:25.195305 1655502 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34664 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/addons-674149/id_rsa Username:docker}
	I1124 08:44:25.196270 1655502 addons.go:436] installing /etc/kubernetes/addons/storageclass.yaml
	I1124 08:44:25.196285 1655502 ssh_runner.go:362] scp storageclass/storageclass.yaml --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
	I1124 08:44:25.196351 1655502 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-674149
	I1124 08:44:25.218735 1655502 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34664 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/addons-674149/id_rsa Username:docker}
	I1124 08:44:25.227483 1655502 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34664 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/addons-674149/id_rsa Username:docker}
	I1124 08:44:25.236872 1655502 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34664 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/addons-674149/id_rsa Username:docker}
	I1124 08:44:25.286665 1655502 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34664 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/addons-674149/id_rsa Username:docker}
	I1124 08:44:25.295832 1655502 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34664 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/addons-674149/id_rsa Username:docker}
	I1124 08:44:25.297808 1655502 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34664 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/addons-674149/id_rsa Username:docker}
	I1124 08:44:25.303208 1655502 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34664 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/addons-674149/id_rsa Username:docker}
	I1124 08:44:25.304050 1655502 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34664 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/addons-674149/id_rsa Username:docker}
	W1124 08:44:25.318924 1655502 sshutil.go:64] dial failure (will retry): ssh: handshake failed: EOF
	I1124 08:44:25.318962 1655502 retry.go:31] will retry after 188.641161ms: ssh: handshake failed: EOF
	I1124 08:44:25.395618 1655502 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.34.2/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml | sed -e '/^        forward . \/etc\/resolv.conf.*/i \        hosts {\n           192.168.49.1 host.minikube.internal\n           fallthrough\n        }' -e '/^        errors *$/i \        log' | sudo /var/lib/minikube/binaries/v1.34.2/kubectl --kubeconfig=/var/lib/minikube/kubeconfig replace -f -"
	I1124 08:44:25.395714 1655502 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1124 08:44:25.883728 1655502 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.2/kubectl apply -f /etc/kubernetes/addons/volcano-deployment.yaml
	I1124 08:44:25.946544 1655502 addons.go:436] installing /etc/kubernetes/addons/yakd-sa.yaml
	I1124 08:44:25.946616 1655502 ssh_runner.go:362] scp yakd/yakd-sa.yaml --> /etc/kubernetes/addons/yakd-sa.yaml (247 bytes)
	I1124 08:44:25.976057 1655502 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.2/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
	I1124 08:44:26.049398 1655502 addons.go:436] installing /etc/kubernetes/addons/yakd-crb.yaml
	I1124 08:44:26.049474 1655502 ssh_runner.go:362] scp yakd/yakd-crb.yaml --> /etc/kubernetes/addons/yakd-crb.yaml (422 bytes)
	I1124 08:44:26.055208 1655502 addons.go:436] installing /etc/kubernetes/addons/registry-svc.yaml
	I1124 08:44:26.055281 1655502 ssh_runner.go:362] scp registry/registry-svc.yaml --> /etc/kubernetes/addons/registry-svc.yaml (398 bytes)
	I1124 08:44:26.076507 1655502 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.2/kubectl apply -f /etc/kubernetes/addons/registry-creds-rc.yaml
	I1124 08:44:26.084786 1655502 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.2/kubectl apply -f /etc/kubernetes/addons/storage-provisioner-rancher.yaml
	I1124 08:44:26.095010 1655502 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.2/kubectl apply -f /etc/kubernetes/addons/nvidia-device-plugin.yaml
	I1124 08:44:26.127348 1655502 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.2/kubectl apply -f /etc/kubernetes/addons/deployment.yaml
	I1124 08:44:26.236754 1655502 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.2/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 08:44:26.262899 1655502 addons.go:436] installing /etc/kubernetes/addons/metrics-server-deployment.yaml
	I1124 08:44:26.262969 1655502 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/metrics-server-deployment.yaml (1907 bytes)
	I1124 08:44:26.287672 1655502 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.2/kubectl apply -f /etc/kubernetes/addons/amd-gpu-device-plugin.yaml
	I1124 08:44:26.295777 1655502 addons.go:436] installing /etc/kubernetes/addons/yakd-svc.yaml
	I1124 08:44:26.295850 1655502 ssh_runner.go:362] scp yakd/yakd-svc.yaml --> /etc/kubernetes/addons/yakd-svc.yaml (412 bytes)
	I1124 08:44:26.351051 1655502 addons.go:436] installing /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotclasses.yaml
	I1124 08:44:26.351111 1655502 ssh_runner.go:362] scp volumesnapshots/snapshot.storage.k8s.io_volumesnapshotclasses.yaml --> /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotclasses.yaml (6471 bytes)
	I1124 08:44:26.379485 1655502 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.2/kubectl apply -f /etc/kubernetes/addons/ig-deployment.yaml
	I1124 08:44:26.384799 1655502 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.2/kubectl apply -f /etc/kubernetes/addons/ingress-deploy.yaml
	I1124 08:44:26.387713 1655502 addons.go:436] installing /etc/kubernetes/addons/rbac-hostpath.yaml
	I1124 08:44:26.387783 1655502 ssh_runner.go:362] scp csi-hostpath-driver/rbac/rbac-hostpath.yaml --> /etc/kubernetes/addons/rbac-hostpath.yaml (4266 bytes)
	I1124 08:44:26.418307 1655502 addons.go:436] installing /etc/kubernetes/addons/registry-proxy.yaml
	I1124 08:44:26.418377 1655502 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/registry-proxy.yaml (947 bytes)
	I1124 08:44:26.491755 1655502 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.2/kubectl apply -f /etc/kubernetes/addons/ingress-dns-pod.yaml
	I1124 08:44:26.542346 1655502 addons.go:436] installing /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotcontents.yaml
	I1124 08:44:26.542422 1655502 ssh_runner.go:362] scp volumesnapshots/snapshot.storage.k8s.io_volumesnapshotcontents.yaml --> /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotcontents.yaml (23126 bytes)
	I1124 08:44:26.546137 1655502 addons.go:436] installing /etc/kubernetes/addons/metrics-server-rbac.yaml
	I1124 08:44:26.546220 1655502 ssh_runner.go:362] scp metrics-server/metrics-server-rbac.yaml --> /etc/kubernetes/addons/metrics-server-rbac.yaml (2175 bytes)
	I1124 08:44:26.578310 1655502 addons.go:436] installing /etc/kubernetes/addons/rbac-external-health-monitor-controller.yaml
	I1124 08:44:26.578384 1655502 ssh_runner.go:362] scp csi-hostpath-driver/rbac/rbac-external-health-monitor-controller.yaml --> /etc/kubernetes/addons/rbac-external-health-monitor-controller.yaml (3038 bytes)
	I1124 08:44:26.629881 1655502 addons.go:436] installing /etc/kubernetes/addons/metrics-server-service.yaml
	I1124 08:44:26.629960 1655502 ssh_runner.go:362] scp metrics-server/metrics-server-service.yaml --> /etc/kubernetes/addons/metrics-server-service.yaml (446 bytes)
	I1124 08:44:26.640111 1655502 addons.go:436] installing /etc/kubernetes/addons/yakd-dp.yaml
	I1124 08:44:26.640182 1655502 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/yakd-dp.yaml (2017 bytes)
	I1124 08:44:26.715830 1655502 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.2/kubectl apply -f /etc/kubernetes/addons/registry-rc.yaml -f /etc/kubernetes/addons/registry-svc.yaml -f /etc/kubernetes/addons/registry-proxy.yaml
	I1124 08:44:26.734574 1655502 addons.go:436] installing /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshots.yaml
	I1124 08:44:26.734647 1655502 ssh_runner.go:362] scp volumesnapshots/snapshot.storage.k8s.io_volumesnapshots.yaml --> /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshots.yaml (19582 bytes)
	I1124 08:44:26.763239 1655502 addons.go:436] installing /etc/kubernetes/addons/rbac-external-provisioner.yaml
	I1124 08:44:26.763314 1655502 ssh_runner.go:362] scp csi-hostpath-driver/rbac/rbac-external-provisioner.yaml --> /etc/kubernetes/addons/rbac-external-provisioner.yaml (4442 bytes)
	I1124 08:44:26.827760 1655502 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.2/kubectl apply -f /etc/kubernetes/addons/metrics-apiservice.yaml -f /etc/kubernetes/addons/metrics-server-deployment.yaml -f /etc/kubernetes/addons/metrics-server-rbac.yaml -f /etc/kubernetes/addons/metrics-server-service.yaml
	I1124 08:44:26.829095 1655502 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.2/kubectl apply -f /etc/kubernetes/addons/yakd-ns.yaml -f /etc/kubernetes/addons/yakd-sa.yaml -f /etc/kubernetes/addons/yakd-crb.yaml -f /etc/kubernetes/addons/yakd-svc.yaml -f /etc/kubernetes/addons/yakd-dp.yaml
	I1124 08:44:26.876851 1655502 addons.go:436] installing /etc/kubernetes/addons/rbac-external-resizer.yaml
	I1124 08:44:26.876929 1655502 ssh_runner.go:362] scp csi-hostpath-driver/rbac/rbac-external-resizer.yaml --> /etc/kubernetes/addons/rbac-external-resizer.yaml (2943 bytes)
	I1124 08:44:26.977858 1655502 addons.go:436] installing /etc/kubernetes/addons/rbac-volume-snapshot-controller.yaml
	I1124 08:44:26.977934 1655502 ssh_runner.go:362] scp volumesnapshots/rbac-volume-snapshot-controller.yaml --> /etc/kubernetes/addons/rbac-volume-snapshot-controller.yaml (3545 bytes)
	I1124 08:44:27.236706 1655502 addons.go:436] installing /etc/kubernetes/addons/rbac-external-snapshotter.yaml
	I1124 08:44:27.236780 1655502 ssh_runner.go:362] scp csi-hostpath-driver/rbac/rbac-external-snapshotter.yaml --> /etc/kubernetes/addons/rbac-external-snapshotter.yaml (3149 bytes)
	I1124 08:44:27.450159 1655502 addons.go:436] installing /etc/kubernetes/addons/volume-snapshot-controller-deployment.yaml
	I1124 08:44:27.450229 1655502 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/volume-snapshot-controller-deployment.yaml (1475 bytes)
	I1124 08:44:27.610639 1655502 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.2/kubectl apply -f /etc/kubernetes/addons/csi-hostpath-snapshotclass.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotclasses.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotcontents.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshots.yaml -f /etc/kubernetes/addons/rbac-volume-snapshot-controller.yaml -f /etc/kubernetes/addons/volume-snapshot-controller-deployment.yaml
	I1124 08:44:27.782705 1655502 addons.go:436] installing /etc/kubernetes/addons/csi-hostpath-attacher.yaml
	I1124 08:44:27.782789 1655502 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/csi-hostpath-attacher.yaml (2143 bytes)
	I1124 08:44:28.125871 1655502 addons.go:436] installing /etc/kubernetes/addons/csi-hostpath-driverinfo.yaml
	I1124 08:44:28.125892 1655502 ssh_runner.go:362] scp csi-hostpath-driver/deploy/csi-hostpath-driverinfo.yaml --> /etc/kubernetes/addons/csi-hostpath-driverinfo.yaml (1274 bytes)
	I1124 08:44:28.220946 1655502 ssh_runner.go:235] Completed: sudo systemctl start kubelet: (2.825206107s)
	I1124 08:44:28.221683 1655502 node_ready.go:35] waiting up to 6m0s for node "addons-674149" to be "Ready" ...
	I1124 08:44:28.221868 1655502 ssh_runner.go:235] Completed: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.34.2/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml | sed -e '/^        forward . \/etc\/resolv.conf.*/i \        hosts {\n           192.168.49.1 host.minikube.internal\n           fallthrough\n        }' -e '/^        errors *$/i \        log' | sudo /var/lib/minikube/binaries/v1.34.2/kubectl --kubeconfig=/var/lib/minikube/kubeconfig replace -f -": (2.82622456s)
	I1124 08:44:28.221882 1655502 start.go:977] {"host.minikube.internal": 192.168.49.1} host record injected into CoreDNS's ConfigMap
	I1124 08:44:28.296813 1655502 addons.go:436] installing /etc/kubernetes/addons/csi-hostpath-plugin.yaml
	I1124 08:44:28.296874 1655502 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/csi-hostpath-plugin.yaml (8201 bytes)
	I1124 08:44:28.457885 1655502 addons.go:436] installing /etc/kubernetes/addons/csi-hostpath-resizer.yaml
	I1124 08:44:28.457956 1655502 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/csi-hostpath-resizer.yaml (2191 bytes)
	I1124 08:44:28.471900 1655502 addons.go:436] installing /etc/kubernetes/addons/csi-hostpath-storageclass.yaml
	I1124 08:44:28.471982 1655502 ssh_runner.go:362] scp csi-hostpath-driver/deploy/csi-hostpath-storageclass.yaml --> /etc/kubernetes/addons/csi-hostpath-storageclass.yaml (846 bytes)
	I1124 08:44:28.485737 1655502 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.2/kubectl apply -f /etc/kubernetes/addons/rbac-external-attacher.yaml -f /etc/kubernetes/addons/rbac-hostpath.yaml -f /etc/kubernetes/addons/rbac-external-health-monitor-controller.yaml -f /etc/kubernetes/addons/rbac-external-provisioner.yaml -f /etc/kubernetes/addons/rbac-external-resizer.yaml -f /etc/kubernetes/addons/rbac-external-snapshotter.yaml -f /etc/kubernetes/addons/csi-hostpath-attacher.yaml -f /etc/kubernetes/addons/csi-hostpath-driverinfo.yaml -f /etc/kubernetes/addons/csi-hostpath-plugin.yaml -f /etc/kubernetes/addons/csi-hostpath-resizer.yaml -f /etc/kubernetes/addons/csi-hostpath-storageclass.yaml
	I1124 08:44:28.729509 1655502 kapi.go:214] "coredns" deployment in "kube-system" namespace and "addons-674149" context rescaled to 1 replicas
	W1124 08:44:30.225745 1655502 node_ready.go:57] node "addons-674149" has "Ready":"False" status (will retry)
	W1124 08:44:32.247989 1655502 node_ready.go:57] node "addons-674149" has "Ready":"False" status (will retry)
	I1124 08:44:32.546105 1655502 ssh_runner.go:362] scp memory --> /var/lib/minikube/google_application_credentials.json (162 bytes)
	I1124 08:44:32.546277 1655502 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-674149
	I1124 08:44:32.575272 1655502 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34664 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/addons-674149/id_rsa Username:docker}
	I1124 08:44:32.747537 1655502 ssh_runner.go:362] scp memory --> /var/lib/minikube/google_cloud_project (12 bytes)
	I1124 08:44:32.770918 1655502 addons.go:239] Setting addon gcp-auth=true in "addons-674149"
	I1124 08:44:32.771042 1655502 host.go:66] Checking if "addons-674149" exists ...
	I1124 08:44:32.771670 1655502 cli_runner.go:164] Run: docker container inspect addons-674149 --format={{.State.Status}}
	I1124 08:44:32.796655 1655502 ssh_runner.go:195] Run: cat /var/lib/minikube/google_application_credentials.json
	I1124 08:44:32.796710 1655502 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" addons-674149
	I1124 08:44:32.820195 1655502 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34664 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/addons-674149/id_rsa Username:docker}
	I1124 08:44:33.349974 1655502 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.2/kubectl apply -f /etc/kubernetes/addons/volcano-deployment.yaml: (7.466207353s)
	I1124 08:44:33.350100 1655502 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.2/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: (7.373973795s)
	I1124 08:44:33.350442 1655502 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.2/kubectl apply -f /etc/kubernetes/addons/registry-creds-rc.yaml: (7.273861214s)
	I1124 08:44:33.350581 1655502 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.2/kubectl apply -f /etc/kubernetes/addons/storage-provisioner-rancher.yaml: (7.265709842s)
	I1124 08:44:33.350757 1655502 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.2/kubectl apply -f /etc/kubernetes/addons/nvidia-device-plugin.yaml: (7.255678471s)
	I1124 08:44:33.350862 1655502 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.2/kubectl apply -f /etc/kubernetes/addons/deployment.yaml: (7.223439625s)
	I1124 08:44:33.350948 1655502 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.2/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: (7.114127516s)
	I1124 08:44:33.351034 1655502 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.2/kubectl apply -f /etc/kubernetes/addons/amd-gpu-device-plugin.yaml: (7.063294868s)
	I1124 08:44:33.351168 1655502 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.2/kubectl apply -f /etc/kubernetes/addons/ig-deployment.yaml: (6.971617127s)
	I1124 08:44:33.351349 1655502 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.2/kubectl apply -f /etc/kubernetes/addons/ingress-deploy.yaml: (6.966480992s)
	I1124 08:44:33.351396 1655502 addons.go:495] Verifying addon ingress=true in "addons-674149"
	I1124 08:44:33.351798 1655502 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.2/kubectl apply -f /etc/kubernetes/addons/ingress-dns-pod.yaml: (6.859872345s)
	I1124 08:44:33.352039 1655502 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.2/kubectl apply -f /etc/kubernetes/addons/registry-rc.yaml -f /etc/kubernetes/addons/registry-svc.yaml -f /etc/kubernetes/addons/registry-proxy.yaml: (6.636135541s)
	I1124 08:44:33.352599 1655502 addons.go:495] Verifying addon registry=true in "addons-674149"
	I1124 08:44:33.352152 1655502 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.2/kubectl apply -f /etc/kubernetes/addons/metrics-apiservice.yaml -f /etc/kubernetes/addons/metrics-server-deployment.yaml -f /etc/kubernetes/addons/metrics-server-rbac.yaml -f /etc/kubernetes/addons/metrics-server-service.yaml: (6.524321276s)
	I1124 08:44:33.352790 1655502 addons.go:495] Verifying addon metrics-server=true in "addons-674149"
	I1124 08:44:33.352198 1655502 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.2/kubectl apply -f /etc/kubernetes/addons/yakd-ns.yaml -f /etc/kubernetes/addons/yakd-sa.yaml -f /etc/kubernetes/addons/yakd-crb.yaml -f /etc/kubernetes/addons/yakd-svc.yaml -f /etc/kubernetes/addons/yakd-dp.yaml: (6.523042346s)
	I1124 08:44:33.352470 1655502 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.2/kubectl apply -f /etc/kubernetes/addons/rbac-external-attacher.yaml -f /etc/kubernetes/addons/rbac-hostpath.yaml -f /etc/kubernetes/addons/rbac-external-health-monitor-controller.yaml -f /etc/kubernetes/addons/rbac-external-provisioner.yaml -f /etc/kubernetes/addons/rbac-external-resizer.yaml -f /etc/kubernetes/addons/rbac-external-snapshotter.yaml -f /etc/kubernetes/addons/csi-hostpath-attacher.yaml -f /etc/kubernetes/addons/csi-hostpath-driverinfo.yaml -f /etc/kubernetes/addons/csi-hostpath-plugin.yaml -f /etc/kubernetes/addons/csi-hostpath-resizer.yaml -f /etc/kubernetes/addons/csi-hostpath-storageclass.yaml: (4.86665131s)
	I1124 08:44:33.354190 1655502 addons.go:495] Verifying addon csi-hostpath-driver=true in "addons-674149"
	I1124 08:44:33.352279 1655502 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.2/kubectl apply -f /etc/kubernetes/addons/csi-hostpath-snapshotclass.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotclasses.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotcontents.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshots.yaml -f /etc/kubernetes/addons/rbac-volume-snapshot-controller.yaml -f /etc/kubernetes/addons/volume-snapshot-controller-deployment.yaml: (5.741574336s)
	W1124 08:44:33.354552 1655502 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.2/kubectl apply -f /etc/kubernetes/addons/csi-hostpath-snapshotclass.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotclasses.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotcontents.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshots.yaml -f /etc/kubernetes/addons/rbac-volume-snapshot-controller.yaml -f /etc/kubernetes/addons/volume-snapshot-controller-deployment.yaml: Process exited with status 1
	stdout:
	customresourcedefinition.apiextensions.k8s.io/volumesnapshotclasses.snapshot.storage.k8s.io created
	customresourcedefinition.apiextensions.k8s.io/volumesnapshotcontents.snapshot.storage.k8s.io created
	customresourcedefinition.apiextensions.k8s.io/volumesnapshots.snapshot.storage.k8s.io created
	serviceaccount/snapshot-controller created
	clusterrole.rbac.authorization.k8s.io/snapshot-controller-runner created
	clusterrolebinding.rbac.authorization.k8s.io/snapshot-controller-role created
	role.rbac.authorization.k8s.io/snapshot-controller-leaderelection created
	rolebinding.rbac.authorization.k8s.io/snapshot-controller-leaderelection created
	deployment.apps/snapshot-controller created
	
	stderr:
	error: resource mapping not found for name: "csi-hostpath-snapclass" namespace: "" from "/etc/kubernetes/addons/csi-hostpath-snapshotclass.yaml": no matches for kind "VolumeSnapshotClass" in version "snapshot.storage.k8s.io/v1"
	ensure CRDs are installed first
	I1124 08:44:33.354578 1655502 retry.go:31] will retry after 320.559417ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.2/kubectl apply -f /etc/kubernetes/addons/csi-hostpath-snapshotclass.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotclasses.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotcontents.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshots.yaml -f /etc/kubernetes/addons/rbac-volume-snapshot-controller.yaml -f /etc/kubernetes/addons/volume-snapshot-controller-deployment.yaml: Process exited with status 1
	stdout:
	customresourcedefinition.apiextensions.k8s.io/volumesnapshotclasses.snapshot.storage.k8s.io created
	customresourcedefinition.apiextensions.k8s.io/volumesnapshotcontents.snapshot.storage.k8s.io created
	customresourcedefinition.apiextensions.k8s.io/volumesnapshots.snapshot.storage.k8s.io created
	serviceaccount/snapshot-controller created
	clusterrole.rbac.authorization.k8s.io/snapshot-controller-runner created
	clusterrolebinding.rbac.authorization.k8s.io/snapshot-controller-role created
	role.rbac.authorization.k8s.io/snapshot-controller-leaderelection created
	rolebinding.rbac.authorization.k8s.io/snapshot-controller-leaderelection created
	deployment.apps/snapshot-controller created
	
	stderr:
	error: resource mapping not found for name: "csi-hostpath-snapclass" namespace: "" from "/etc/kubernetes/addons/csi-hostpath-snapshotclass.yaml": no matches for kind "VolumeSnapshotClass" in version "snapshot.storage.k8s.io/v1"
	ensure CRDs are installed first
	I1124 08:44:33.354937 1655502 out.go:179] * Verifying ingress addon...
	I1124 08:44:33.357332 1655502 out.go:179] * Verifying registry addon...
	I1124 08:44:33.357384 1655502 out.go:179]   - Using image registry.k8s.io/ingress-nginx/kube-webhook-certgen:v1.6.4
	I1124 08:44:33.357409 1655502 out.go:179] * Verifying csi-hostpath-driver addon...
	I1124 08:44:33.357420 1655502 out.go:179] * To access YAKD - Kubernetes Dashboard, wait for Pod to be ready and run the following command:
	
		minikube -p addons-674149 service yakd-dashboard -n yakd-dashboard
	
	I1124 08:44:33.360241 1655502 kapi.go:75] Waiting for pod with label "app.kubernetes.io/name=ingress-nginx" in ns "ingress-nginx" ...
	I1124 08:44:33.362169 1655502 kapi.go:75] Waiting for pod with label "kubernetes.io/minikube-addons=registry" in ns "kube-system" ...
	I1124 08:44:33.363940 1655502 kapi.go:75] Waiting for pod with label "kubernetes.io/minikube-addons=csi-hostpath-driver" in ns "kube-system" ...
	I1124 08:44:33.366672 1655502 out.go:179]   - Using image gcr.io/k8s-minikube/gcp-auth-webhook:v0.1.3
	I1124 08:44:33.369688 1655502 addons.go:436] installing /etc/kubernetes/addons/gcp-auth-ns.yaml
	I1124 08:44:33.369769 1655502 ssh_runner.go:362] scp gcp-auth/gcp-auth-ns.yaml --> /etc/kubernetes/addons/gcp-auth-ns.yaml (700 bytes)
	I1124 08:44:33.387626 1655502 kapi.go:86] Found 1 Pods for label selector kubernetes.io/minikube-addons=registry
	I1124 08:44:33.387653 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:44:33.411281 1655502 kapi.go:86] Found 3 Pods for label selector app.kubernetes.io/name=ingress-nginx
	I1124 08:44:33.411310 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:44:33.411394 1655502 kapi.go:86] Found 2 Pods for label selector kubernetes.io/minikube-addons=csi-hostpath-driver
	I1124 08:44:33.411406 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:44:33.436035 1655502 addons.go:436] installing /etc/kubernetes/addons/gcp-auth-service.yaml
	I1124 08:44:33.436059 1655502 ssh_runner.go:362] scp gcp-auth/gcp-auth-service.yaml --> /etc/kubernetes/addons/gcp-auth-service.yaml (788 bytes)
	I1124 08:44:33.477877 1655502 addons.go:436] installing /etc/kubernetes/addons/gcp-auth-webhook.yaml
	I1124 08:44:33.477900 1655502 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/gcp-auth-webhook.yaml (5421 bytes)
	W1124 08:44:33.478623 1655502 out.go:285] ! Enabling 'default-storageclass' returned an error: running callbacks: [Error making standard the default storage class: Error while marking storage class csi-hostpath-sc as non-default: Operation cannot be fulfilled on storageclasses.storage.k8s.io "csi-hostpath-sc": the object has been modified; please apply your changes to the latest version and try again]
	I1124 08:44:33.515041 1655502 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.2/kubectl apply -f /etc/kubernetes/addons/gcp-auth-ns.yaml -f /etc/kubernetes/addons/gcp-auth-service.yaml -f /etc/kubernetes/addons/gcp-auth-webhook.yaml
	I1124 08:44:33.676164 1655502 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.2/kubectl apply --force -f /etc/kubernetes/addons/csi-hostpath-snapshotclass.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotclasses.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotcontents.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshots.yaml -f /etc/kubernetes/addons/rbac-volume-snapshot-controller.yaml -f /etc/kubernetes/addons/volume-snapshot-controller-deployment.yaml
	I1124 08:44:33.872971 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:44:33.873192 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:44:33.873276 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:44:34.366625 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:44:34.366842 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:44:34.368051 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:44:34.510537 1655502 addons.go:495] Verifying addon gcp-auth=true in "addons-674149"
	I1124 08:44:34.513679 1655502 out.go:179] * Verifying gcp-auth addon...
	I1124 08:44:34.517205 1655502 kapi.go:75] Waiting for pod with label "kubernetes.io/minikube-addons=gcp-auth" in ns "gcp-auth" ...
	I1124 08:44:34.520286 1655502 kapi.go:86] Found 1 Pods for label selector kubernetes.io/minikube-addons=gcp-auth
	I1124 08:44:34.520311 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	W1124 08:44:34.738756 1655502 node_ready.go:57] node "addons-674149" has "Ready":"False" status (will retry)
	I1124 08:44:34.750733 1655502 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.2/kubectl apply --force -f /etc/kubernetes/addons/csi-hostpath-snapshotclass.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotclasses.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotcontents.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshots.yaml -f /etc/kubernetes/addons/rbac-volume-snapshot-controller.yaml -f /etc/kubernetes/addons/volume-snapshot-controller-deployment.yaml: (1.074511853s)
	I1124 08:44:34.865135 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:44:34.865674 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:44:34.867591 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:44:35.020924 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:44:35.363832 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:44:35.366058 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:44:35.366649 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:44:35.520944 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:44:35.863647 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:44:35.868374 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:44:35.868578 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:44:36.020831 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:44:36.365358 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:44:36.365921 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:44:36.368206 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:44:36.520236 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:44:36.864840 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:44:36.866261 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:44:36.866864 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:44:37.020855 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	W1124 08:44:37.224474 1655502 node_ready.go:57] node "addons-674149" has "Ready":"False" status (will retry)
	I1124 08:44:37.364372 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:44:37.365910 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:44:37.367098 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:44:37.520065 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:44:37.864600 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:44:37.865281 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:44:37.867470 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:44:38.021038 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:44:38.365278 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:44:38.366970 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:44:38.368060 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:44:38.523233 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:44:38.864756 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:44:38.865218 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:44:38.866663 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:44:39.020772 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	W1124 08:44:39.224851 1655502 node_ready.go:57] node "addons-674149" has "Ready":"False" status (will retry)
	I1124 08:44:39.364255 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:44:39.365733 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:44:39.366904 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:44:39.521103 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:44:39.864835 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:44:39.865211 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:44:39.866299 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:44:40.023210 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:44:40.365390 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:44:40.366705 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:44:40.367766 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:44:40.520891 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:44:40.864323 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:44:40.866514 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:44:40.867045 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:44:41.021119 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	W1124 08:44:41.225025 1655502 node_ready.go:57] node "addons-674149" has "Ready":"False" status (will retry)
	I1124 08:44:41.366678 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:44:41.367591 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:44:41.368468 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:44:41.520464 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:44:41.863597 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:44:41.865713 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:44:41.867165 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:44:42.021049 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:44:42.365346 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:44:42.367050 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:44:42.367628 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:44:42.520729 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:44:42.863961 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:44:42.865848 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:44:42.866775 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:44:43.020839 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:44:43.363653 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:44:43.365899 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:44:43.367176 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:44:43.520456 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	W1124 08:44:43.727380 1655502 node_ready.go:57] node "addons-674149" has "Ready":"False" status (will retry)
	I1124 08:44:43.864600 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:44:43.865705 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:44:43.867321 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:44:44.020655 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:44:44.365426 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:44:44.367520 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:44:44.367681 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:44:44.521022 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:44:44.865136 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:44:44.865582 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:44:44.866859 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:44:45.026934 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:44:45.364052 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:44:45.366308 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:44:45.366363 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:44:45.520344 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:44:45.865842 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:44:45.865939 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:44:45.866923 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:44:46.021112 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	W1124 08:44:46.224974 1655502 node_ready.go:57] node "addons-674149" has "Ready":"False" status (will retry)
	I1124 08:44:46.364532 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:44:46.366846 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:44:46.367334 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:44:46.520863 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:44:46.864256 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:44:46.866399 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:44:46.867094 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:44:47.021250 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:44:47.364282 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:44:47.365616 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:44:47.366748 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:44:47.520764 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:44:47.864243 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:44:47.866093 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:44:47.866935 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:44:48.021654 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	W1124 08:44:48.225352 1655502 node_ready.go:57] node "addons-674149" has "Ready":"False" status (will retry)
	I1124 08:44:48.364617 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:44:48.366498 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:44:48.367025 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:44:48.520362 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:44:48.864348 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:44:48.864603 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:44:48.866639 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:44:49.020728 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:44:49.366359 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:44:49.366982 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:44:49.368199 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:44:49.520150 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:44:49.866057 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:44:49.866975 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:44:49.867795 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:44:50.030152 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:44:50.364073 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:44:50.366788 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:44:50.367198 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:44:50.521049 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	W1124 08:44:50.724907 1655502 node_ready.go:57] node "addons-674149" has "Ready":"False" status (will retry)
	I1124 08:44:50.864027 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:44:50.866506 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:44:50.866573 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:44:51.021117 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:44:51.363593 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:44:51.366243 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:44:51.367641 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:44:51.520840 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:44:51.863266 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:44:51.866195 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:44:51.866874 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:44:52.021344 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:44:52.364980 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:44:52.365114 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:44:52.367331 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:44:52.520267 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	W1124 08:44:52.725231 1655502 node_ready.go:57] node "addons-674149" has "Ready":"False" status (will retry)
	I1124 08:44:52.864327 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:44:52.865138 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:44:52.866811 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:44:53.020852 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:44:53.364502 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:44:53.364728 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:44:53.366274 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:44:53.520065 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:44:53.869157 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:44:53.885398 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:44:53.885603 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:44:54.020772 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:44:54.364455 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:44:54.365631 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:44:54.367290 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:44:54.520218 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	W1124 08:44:54.725530 1655502 node_ready.go:57] node "addons-674149" has "Ready":"False" status (will retry)
	I1124 08:44:54.863442 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:44:54.865144 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:44:54.866683 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:44:55.021161 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:44:55.364744 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:44:55.365422 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:44:55.367244 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:44:55.520058 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:44:55.865165 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:44:55.866538 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:44:55.868052 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:44:56.020600 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:44:56.363921 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:44:56.365361 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:44:56.366990 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:44:56.521812 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:44:56.863847 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:44:56.864983 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:44:56.866897 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:44:57.021215 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	W1124 08:44:57.224841 1655502 node_ready.go:57] node "addons-674149" has "Ready":"False" status (will retry)
	I1124 08:44:57.364473 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:44:57.374848 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:44:57.374997 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:44:57.520606 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:44:57.863346 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:44:57.866179 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:44:57.867367 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:44:58.020652 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:44:58.364983 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:44:58.365523 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:44:58.367163 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:44:58.520589 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:44:58.864271 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:44:58.865193 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:44:58.867576 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:44:59.020738 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:44:59.363217 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:44:59.365690 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:44:59.366570 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:44:59.520398 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	W1124 08:44:59.725462 1655502 node_ready.go:57] node "addons-674149" has "Ready":"False" status (will retry)
	I1124 08:44:59.863151 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:44:59.865598 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:44:59.866519 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:00.025126 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:00.414857 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:00.415585 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:00.427552 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:00.526439 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:00.864747 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:00.867661 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:00.867921 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:01.021416 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:01.364418 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:01.367897 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:01.368104 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:01.520451 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:01.864767 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:01.866293 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:01.867872 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:02.021659 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	W1124 08:45:02.224850 1655502 node_ready.go:57] node "addons-674149" has "Ready":"False" status (will retry)
	I1124 08:45:02.363869 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:02.367119 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:02.367769 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:02.520883 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:02.863709 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:02.866240 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:02.867183 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:03.020322 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:03.363784 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:03.366649 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:03.367525 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:03.521143 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:03.864406 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:03.865867 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:03.866807 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:04.021412 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	W1124 08:45:04.225809 1655502 node_ready.go:57] node "addons-674149" has "Ready":"False" status (will retry)
	I1124 08:45:04.365645 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:04.366577 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:04.367581 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:04.520490 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:04.864401 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:04.865732 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:04.867114 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:05.021216 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:05.364300 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:05.366449 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:05.367126 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:05.520420 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:05.863628 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:05.866253 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:05.867840 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:06.021248 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:06.225861 1655502 node_ready.go:49] node "addons-674149" is "Ready"
	I1124 08:45:06.225942 1655502 node_ready.go:38] duration metric: took 38.004235406s for node "addons-674149" to be "Ready" ...
	I1124 08:45:06.225971 1655502 api_server.go:52] waiting for apiserver process to appear ...
	I1124 08:45:06.226058 1655502 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 08:45:06.265974 1655502 api_server.go:72] duration metric: took 41.758922087s to wait for apiserver process to appear ...
	I1124 08:45:06.266045 1655502 api_server.go:88] waiting for apiserver healthz status ...
	I1124 08:45:06.266078 1655502 api_server.go:253] Checking apiserver healthz at https://192.168.49.2:8443/healthz ...
	I1124 08:45:06.294569 1655502 api_server.go:279] https://192.168.49.2:8443/healthz returned 200:
	ok
	I1124 08:45:06.308933 1655502 api_server.go:141] control plane version: v1.34.2
	I1124 08:45:06.308967 1655502 api_server.go:131] duration metric: took 42.902272ms to wait for apiserver health ...
	I1124 08:45:06.308976 1655502 system_pods.go:43] waiting for kube-system pods to appear ...
	I1124 08:45:06.326313 1655502 system_pods.go:59] 18 kube-system pods found
	I1124 08:45:06.326349 1655502 system_pods.go:61] "coredns-66bc5c9577-z7rx4" [9d258450-8e43-4a1d-83b9-acc957e7e84e] Pending
	I1124 08:45:06.326356 1655502 system_pods.go:61] "csi-hostpath-attacher-0" [db4ed975-28ec-4f06-b637-3f85549065bc] Pending
	I1124 08:45:06.326361 1655502 system_pods.go:61] "csi-hostpath-resizer-0" [45a8d51b-700a-4053-bcfc-006ebf44747f] Pending
	I1124 08:45:06.326367 1655502 system_pods.go:61] "etcd-addons-674149" [2e4e1dc4-c923-4884-b89d-4f018aeb63e0] Running
	I1124 08:45:06.326400 1655502 system_pods.go:61] "kindnet-r5bjv" [2456a3a4-c304-434f-a3d3-10a649ea0613] Running
	I1124 08:45:06.326413 1655502 system_pods.go:61] "kube-apiserver-addons-674149" [b564c2a6-1c00-4928-a89a-6597cbcc60d9] Running
	I1124 08:45:06.326418 1655502 system_pods.go:61] "kube-controller-manager-addons-674149" [a2adadce-c5ea-4279-8f4e-e8c204ac0f0e] Running
	I1124 08:45:06.326423 1655502 system_pods.go:61] "kube-ingress-dns-minikube" [9ecc5d6e-f8a8-4a39-afbb-34b9e1d839d8] Pending
	I1124 08:45:06.326426 1655502 system_pods.go:61] "kube-proxy-jzklp" [49037c7f-2825-40b9-a07c-1f95171da2ea] Running
	I1124 08:45:06.326437 1655502 system_pods.go:61] "kube-scheduler-addons-674149" [c8f105b3-a19c-4c03-bc3d-62a30d5b485a] Running
	I1124 08:45:06.326446 1655502 system_pods.go:61] "metrics-server-85b7d694d7-b9zbj" [a7c3b53c-5165-4497-928f-eb4e87ef1d5d] Pending / Ready:ContainersNotReady (containers with unready status: [metrics-server]) / ContainersReady:ContainersNotReady (containers with unready status: [metrics-server])
	I1124 08:45:06.326481 1655502 system_pods.go:61] "nvidia-device-plugin-daemonset-7dms5" [66c6813c-89d6-4546-b6ab-601ab40673c5] Pending
	I1124 08:45:06.326503 1655502 system_pods.go:61] "registry-6b586f9694-dlvbb" [bc6d1721-0552-4008-a2c9-c95ed3408c81] Pending
	I1124 08:45:06.326508 1655502 system_pods.go:61] "registry-creds-764b6fb674-jclkj" [edbb8fb0-f78a-4f96-99a3-61db80d1454e] Pending
	I1124 08:45:06.326511 1655502 system_pods.go:61] "registry-proxy-9kgmn" [f317f307-5762-4134-a0d0-d3ccd22e9b5d] Pending
	I1124 08:45:06.326516 1655502 system_pods.go:61] "snapshot-controller-7d9fbc56b8-7f9n5" [d8ae6b06-b7bb-470b-b43b-e0c9dfa97365] Pending
	I1124 08:45:06.326526 1655502 system_pods.go:61] "snapshot-controller-7d9fbc56b8-94tt9" [ae1cb8ed-db5a-4e57-85ee-0c75841e16b4] Pending
	I1124 08:45:06.326530 1655502 system_pods.go:61] "storage-provisioner" [b7e97ec5-7df5-4038-88d3-a9e525f999fb] Pending
	I1124 08:45:06.326536 1655502 system_pods.go:74] duration metric: took 17.522572ms to wait for pod list to return data ...
	I1124 08:45:06.326548 1655502 default_sa.go:34] waiting for default service account to be created ...
	I1124 08:45:06.348438 1655502 default_sa.go:45] found service account: "default"
	I1124 08:45:06.348467 1655502 default_sa.go:55] duration metric: took 21.913215ms for default service account to be created ...
	I1124 08:45:06.348479 1655502 system_pods.go:116] waiting for k8s-apps to be running ...
	I1124 08:45:06.379580 1655502 system_pods.go:86] 19 kube-system pods found
	I1124 08:45:06.379611 1655502 system_pods.go:89] "coredns-66bc5c9577-z7rx4" [9d258450-8e43-4a1d-83b9-acc957e7e84e] Pending
	I1124 08:45:06.379617 1655502 system_pods.go:89] "csi-hostpath-attacher-0" [db4ed975-28ec-4f06-b637-3f85549065bc] Pending
	I1124 08:45:06.379656 1655502 system_pods.go:89] "csi-hostpath-resizer-0" [45a8d51b-700a-4053-bcfc-006ebf44747f] Pending
	I1124 08:45:06.379670 1655502 system_pods.go:89] "csi-hostpathplugin-h9bf9" [67da1825-4079-4f6c-87f7-a437e37504b6] Pending
	I1124 08:45:06.379691 1655502 system_pods.go:89] "etcd-addons-674149" [2e4e1dc4-c923-4884-b89d-4f018aeb63e0] Running
	I1124 08:45:06.379697 1655502 system_pods.go:89] "kindnet-r5bjv" [2456a3a4-c304-434f-a3d3-10a649ea0613] Running
	I1124 08:45:06.379715 1655502 system_pods.go:89] "kube-apiserver-addons-674149" [b564c2a6-1c00-4928-a89a-6597cbcc60d9] Running
	I1124 08:45:06.379719 1655502 system_pods.go:89] "kube-controller-manager-addons-674149" [a2adadce-c5ea-4279-8f4e-e8c204ac0f0e] Running
	I1124 08:45:06.379724 1655502 system_pods.go:89] "kube-ingress-dns-minikube" [9ecc5d6e-f8a8-4a39-afbb-34b9e1d839d8] Pending
	I1124 08:45:06.379732 1655502 system_pods.go:89] "kube-proxy-jzklp" [49037c7f-2825-40b9-a07c-1f95171da2ea] Running
	I1124 08:45:06.379736 1655502 system_pods.go:89] "kube-scheduler-addons-674149" [c8f105b3-a19c-4c03-bc3d-62a30d5b485a] Running
	I1124 08:45:06.379744 1655502 system_pods.go:89] "metrics-server-85b7d694d7-b9zbj" [a7c3b53c-5165-4497-928f-eb4e87ef1d5d] Pending / Ready:ContainersNotReady (containers with unready status: [metrics-server]) / ContainersReady:ContainersNotReady (containers with unready status: [metrics-server])
	I1124 08:45:06.379764 1655502 system_pods.go:89] "nvidia-device-plugin-daemonset-7dms5" [66c6813c-89d6-4546-b6ab-601ab40673c5] Pending
	I1124 08:45:06.379774 1655502 system_pods.go:89] "registry-6b586f9694-dlvbb" [bc6d1721-0552-4008-a2c9-c95ed3408c81] Pending
	I1124 08:45:06.379778 1655502 system_pods.go:89] "registry-creds-764b6fb674-jclkj" [edbb8fb0-f78a-4f96-99a3-61db80d1454e] Pending
	I1124 08:45:06.379782 1655502 system_pods.go:89] "registry-proxy-9kgmn" [f317f307-5762-4134-a0d0-d3ccd22e9b5d] Pending
	I1124 08:45:06.379786 1655502 system_pods.go:89] "snapshot-controller-7d9fbc56b8-7f9n5" [d8ae6b06-b7bb-470b-b43b-e0c9dfa97365] Pending
	I1124 08:45:06.379797 1655502 system_pods.go:89] "snapshot-controller-7d9fbc56b8-94tt9" [ae1cb8ed-db5a-4e57-85ee-0c75841e16b4] Pending
	I1124 08:45:06.379801 1655502 system_pods.go:89] "storage-provisioner" [b7e97ec5-7df5-4038-88d3-a9e525f999fb] Pending
	I1124 08:45:06.379816 1655502 retry.go:31] will retry after 220.209185ms: missing components: kube-dns
	I1124 08:45:06.394894 1655502 kapi.go:86] Found 2 Pods for label selector kubernetes.io/minikube-addons=registry
	I1124 08:45:06.394919 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:06.395499 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:06.395918 1655502 kapi.go:86] Found 3 Pods for label selector kubernetes.io/minikube-addons=csi-hostpath-driver
	I1124 08:45:06.395937 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:06.522600 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:06.626889 1655502 system_pods.go:86] 19 kube-system pods found
	I1124 08:45:06.626941 1655502 system_pods.go:89] "coredns-66bc5c9577-z7rx4" [9d258450-8e43-4a1d-83b9-acc957e7e84e] Pending / Ready:ContainersNotReady (containers with unready status: [coredns]) / ContainersReady:ContainersNotReady (containers with unready status: [coredns])
	I1124 08:45:06.626949 1655502 system_pods.go:89] "csi-hostpath-attacher-0" [db4ed975-28ec-4f06-b637-3f85549065bc] Pending
	I1124 08:45:06.626985 1655502 system_pods.go:89] "csi-hostpath-resizer-0" [45a8d51b-700a-4053-bcfc-006ebf44747f] Pending
	I1124 08:45:06.626998 1655502 system_pods.go:89] "csi-hostpathplugin-h9bf9" [67da1825-4079-4f6c-87f7-a437e37504b6] Pending
	I1124 08:45:06.627027 1655502 system_pods.go:89] "etcd-addons-674149" [2e4e1dc4-c923-4884-b89d-4f018aeb63e0] Running
	I1124 08:45:06.627047 1655502 system_pods.go:89] "kindnet-r5bjv" [2456a3a4-c304-434f-a3d3-10a649ea0613] Running
	I1124 08:45:06.627061 1655502 system_pods.go:89] "kube-apiserver-addons-674149" [b564c2a6-1c00-4928-a89a-6597cbcc60d9] Running
	I1124 08:45:06.627067 1655502 system_pods.go:89] "kube-controller-manager-addons-674149" [a2adadce-c5ea-4279-8f4e-e8c204ac0f0e] Running
	I1124 08:45:06.627075 1655502 system_pods.go:89] "kube-ingress-dns-minikube" [9ecc5d6e-f8a8-4a39-afbb-34b9e1d839d8] Pending
	I1124 08:45:06.627079 1655502 system_pods.go:89] "kube-proxy-jzklp" [49037c7f-2825-40b9-a07c-1f95171da2ea] Running
	I1124 08:45:06.627083 1655502 system_pods.go:89] "kube-scheduler-addons-674149" [c8f105b3-a19c-4c03-bc3d-62a30d5b485a] Running
	I1124 08:45:06.627097 1655502 system_pods.go:89] "metrics-server-85b7d694d7-b9zbj" [a7c3b53c-5165-4497-928f-eb4e87ef1d5d] Pending / Ready:ContainersNotReady (containers with unready status: [metrics-server]) / ContainersReady:ContainersNotReady (containers with unready status: [metrics-server])
	I1124 08:45:06.627104 1655502 system_pods.go:89] "nvidia-device-plugin-daemonset-7dms5" [66c6813c-89d6-4546-b6ab-601ab40673c5] Pending
	I1124 08:45:06.627113 1655502 system_pods.go:89] "registry-6b586f9694-dlvbb" [bc6d1721-0552-4008-a2c9-c95ed3408c81] Pending
	I1124 08:45:06.627130 1655502 system_pods.go:89] "registry-creds-764b6fb674-jclkj" [edbb8fb0-f78a-4f96-99a3-61db80d1454e] Pending / Ready:ContainersNotReady (containers with unready status: [registry-creds]) / ContainersReady:ContainersNotReady (containers with unready status: [registry-creds])
	I1124 08:45:06.627141 1655502 system_pods.go:89] "registry-proxy-9kgmn" [f317f307-5762-4134-a0d0-d3ccd22e9b5d] Pending
	I1124 08:45:06.627146 1655502 system_pods.go:89] "snapshot-controller-7d9fbc56b8-7f9n5" [d8ae6b06-b7bb-470b-b43b-e0c9dfa97365] Pending
	I1124 08:45:06.627150 1655502 system_pods.go:89] "snapshot-controller-7d9fbc56b8-94tt9" [ae1cb8ed-db5a-4e57-85ee-0c75841e16b4] Pending
	I1124 08:45:06.627154 1655502 system_pods.go:89] "storage-provisioner" [b7e97ec5-7df5-4038-88d3-a9e525f999fb] Pending
	I1124 08:45:06.627169 1655502 retry.go:31] will retry after 260.647178ms: missing components: kube-dns
	I1124 08:45:06.870052 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:06.870257 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:06.870530 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:06.901584 1655502 system_pods.go:86] 19 kube-system pods found
	I1124 08:45:06.901621 1655502 system_pods.go:89] "coredns-66bc5c9577-z7rx4" [9d258450-8e43-4a1d-83b9-acc957e7e84e] Pending / Ready:ContainersNotReady (containers with unready status: [coredns]) / ContainersReady:ContainersNotReady (containers with unready status: [coredns])
	I1124 08:45:06.901630 1655502 system_pods.go:89] "csi-hostpath-attacher-0" [db4ed975-28ec-4f06-b637-3f85549065bc] Pending / Ready:ContainersNotReady (containers with unready status: [csi-attacher]) / ContainersReady:ContainersNotReady (containers with unready status: [csi-attacher])
	I1124 08:45:06.901679 1655502 system_pods.go:89] "csi-hostpath-resizer-0" [45a8d51b-700a-4053-bcfc-006ebf44747f] Pending
	I1124 08:45:06.901684 1655502 system_pods.go:89] "csi-hostpathplugin-h9bf9" [67da1825-4079-4f6c-87f7-a437e37504b6] Pending
	I1124 08:45:06.901688 1655502 system_pods.go:89] "etcd-addons-674149" [2e4e1dc4-c923-4884-b89d-4f018aeb63e0] Running
	I1124 08:45:06.901695 1655502 system_pods.go:89] "kindnet-r5bjv" [2456a3a4-c304-434f-a3d3-10a649ea0613] Running
	I1124 08:45:06.901702 1655502 system_pods.go:89] "kube-apiserver-addons-674149" [b564c2a6-1c00-4928-a89a-6597cbcc60d9] Running
	I1124 08:45:06.901707 1655502 system_pods.go:89] "kube-controller-manager-addons-674149" [a2adadce-c5ea-4279-8f4e-e8c204ac0f0e] Running
	I1124 08:45:06.901717 1655502 system_pods.go:89] "kube-ingress-dns-minikube" [9ecc5d6e-f8a8-4a39-afbb-34b9e1d839d8] Pending / Ready:ContainersNotReady (containers with unready status: [minikube-ingress-dns]) / ContainersReady:ContainersNotReady (containers with unready status: [minikube-ingress-dns])
	I1124 08:45:06.901748 1655502 system_pods.go:89] "kube-proxy-jzklp" [49037c7f-2825-40b9-a07c-1f95171da2ea] Running
	I1124 08:45:06.901764 1655502 system_pods.go:89] "kube-scheduler-addons-674149" [c8f105b3-a19c-4c03-bc3d-62a30d5b485a] Running
	I1124 08:45:06.901772 1655502 system_pods.go:89] "metrics-server-85b7d694d7-b9zbj" [a7c3b53c-5165-4497-928f-eb4e87ef1d5d] Pending / Ready:ContainersNotReady (containers with unready status: [metrics-server]) / ContainersReady:ContainersNotReady (containers with unready status: [metrics-server])
	I1124 08:45:06.901776 1655502 system_pods.go:89] "nvidia-device-plugin-daemonset-7dms5" [66c6813c-89d6-4546-b6ab-601ab40673c5] Pending
	I1124 08:45:06.901788 1655502 system_pods.go:89] "registry-6b586f9694-dlvbb" [bc6d1721-0552-4008-a2c9-c95ed3408c81] Pending / Ready:ContainersNotReady (containers with unready status: [registry]) / ContainersReady:ContainersNotReady (containers with unready status: [registry])
	I1124 08:45:06.901793 1655502 system_pods.go:89] "registry-creds-764b6fb674-jclkj" [edbb8fb0-f78a-4f96-99a3-61db80d1454e] Pending / Ready:ContainersNotReady (containers with unready status: [registry-creds]) / ContainersReady:ContainersNotReady (containers with unready status: [registry-creds])
	I1124 08:45:06.901798 1655502 system_pods.go:89] "registry-proxy-9kgmn" [f317f307-5762-4134-a0d0-d3ccd22e9b5d] Pending
	I1124 08:45:06.901810 1655502 system_pods.go:89] "snapshot-controller-7d9fbc56b8-7f9n5" [d8ae6b06-b7bb-470b-b43b-e0c9dfa97365] Pending
	I1124 08:45:06.901826 1655502 system_pods.go:89] "snapshot-controller-7d9fbc56b8-94tt9" [ae1cb8ed-db5a-4e57-85ee-0c75841e16b4] Pending
	I1124 08:45:06.901839 1655502 system_pods.go:89] "storage-provisioner" [b7e97ec5-7df5-4038-88d3-a9e525f999fb] Pending
	I1124 08:45:06.901855 1655502 retry.go:31] will retry after 334.321289ms: missing components: kube-dns
	I1124 08:45:07.022528 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:07.242387 1655502 system_pods.go:86] 19 kube-system pods found
	I1124 08:45:07.242426 1655502 system_pods.go:89] "coredns-66bc5c9577-z7rx4" [9d258450-8e43-4a1d-83b9-acc957e7e84e] Pending / Ready:ContainersNotReady (containers with unready status: [coredns]) / ContainersReady:ContainersNotReady (containers with unready status: [coredns])
	I1124 08:45:07.242437 1655502 system_pods.go:89] "csi-hostpath-attacher-0" [db4ed975-28ec-4f06-b637-3f85549065bc] Pending / Ready:ContainersNotReady (containers with unready status: [csi-attacher]) / ContainersReady:ContainersNotReady (containers with unready status: [csi-attacher])
	I1124 08:45:07.242444 1655502 system_pods.go:89] "csi-hostpath-resizer-0" [45a8d51b-700a-4053-bcfc-006ebf44747f] Pending / Ready:ContainersNotReady (containers with unready status: [csi-resizer]) / ContainersReady:ContainersNotReady (containers with unready status: [csi-resizer])
	I1124 08:45:07.243487 1655502 system_pods.go:89] "csi-hostpathplugin-h9bf9" [67da1825-4079-4f6c-87f7-a437e37504b6] Pending / Ready:ContainersNotReady (containers with unready status: [csi-external-health-monitor-controller node-driver-registrar hostpath liveness-probe csi-provisioner csi-snapshotter]) / ContainersReady:ContainersNotReady (containers with unready status: [csi-external-health-monitor-controller node-driver-registrar hostpath liveness-probe csi-provisioner csi-snapshotter])
	I1124 08:45:07.243501 1655502 system_pods.go:89] "etcd-addons-674149" [2e4e1dc4-c923-4884-b89d-4f018aeb63e0] Running
	I1124 08:45:07.243507 1655502 system_pods.go:89] "kindnet-r5bjv" [2456a3a4-c304-434f-a3d3-10a649ea0613] Running
	I1124 08:45:07.243512 1655502 system_pods.go:89] "kube-apiserver-addons-674149" [b564c2a6-1c00-4928-a89a-6597cbcc60d9] Running
	I1124 08:45:07.243517 1655502 system_pods.go:89] "kube-controller-manager-addons-674149" [a2adadce-c5ea-4279-8f4e-e8c204ac0f0e] Running
	I1124 08:45:07.243536 1655502 system_pods.go:89] "kube-ingress-dns-minikube" [9ecc5d6e-f8a8-4a39-afbb-34b9e1d839d8] Pending / Ready:ContainersNotReady (containers with unready status: [minikube-ingress-dns]) / ContainersReady:ContainersNotReady (containers with unready status: [minikube-ingress-dns])
	I1124 08:45:07.243548 1655502 system_pods.go:89] "kube-proxy-jzklp" [49037c7f-2825-40b9-a07c-1f95171da2ea] Running
	I1124 08:45:07.243554 1655502 system_pods.go:89] "kube-scheduler-addons-674149" [c8f105b3-a19c-4c03-bc3d-62a30d5b485a] Running
	I1124 08:45:07.243561 1655502 system_pods.go:89] "metrics-server-85b7d694d7-b9zbj" [a7c3b53c-5165-4497-928f-eb4e87ef1d5d] Pending / Ready:ContainersNotReady (containers with unready status: [metrics-server]) / ContainersReady:ContainersNotReady (containers with unready status: [metrics-server])
	I1124 08:45:07.243578 1655502 system_pods.go:89] "nvidia-device-plugin-daemonset-7dms5" [66c6813c-89d6-4546-b6ab-601ab40673c5] Pending / Ready:ContainersNotReady (containers with unready status: [nvidia-device-plugin-ctr]) / ContainersReady:ContainersNotReady (containers with unready status: [nvidia-device-plugin-ctr])
	I1124 08:45:07.243585 1655502 system_pods.go:89] "registry-6b586f9694-dlvbb" [bc6d1721-0552-4008-a2c9-c95ed3408c81] Pending / Ready:ContainersNotReady (containers with unready status: [registry]) / ContainersReady:ContainersNotReady (containers with unready status: [registry])
	I1124 08:45:07.243596 1655502 system_pods.go:89] "registry-creds-764b6fb674-jclkj" [edbb8fb0-f78a-4f96-99a3-61db80d1454e] Pending / Ready:ContainersNotReady (containers with unready status: [registry-creds]) / ContainersReady:ContainersNotReady (containers with unready status: [registry-creds])
	I1124 08:45:07.243623 1655502 system_pods.go:89] "registry-proxy-9kgmn" [f317f307-5762-4134-a0d0-d3ccd22e9b5d] Pending / Ready:ContainersNotReady (containers with unready status: [registry-proxy]) / ContainersReady:ContainersNotReady (containers with unready status: [registry-proxy])
	I1124 08:45:07.243637 1655502 system_pods.go:89] "snapshot-controller-7d9fbc56b8-7f9n5" [d8ae6b06-b7bb-470b-b43b-e0c9dfa97365] Pending / Ready:ContainersNotReady (containers with unready status: [volume-snapshot-controller]) / ContainersReady:ContainersNotReady (containers with unready status: [volume-snapshot-controller])
	I1124 08:45:07.243646 1655502 system_pods.go:89] "snapshot-controller-7d9fbc56b8-94tt9" [ae1cb8ed-db5a-4e57-85ee-0c75841e16b4] Pending / Ready:ContainersNotReady (containers with unready status: [volume-snapshot-controller]) / ContainersReady:ContainersNotReady (containers with unready status: [volume-snapshot-controller])
	I1124 08:45:07.243652 1655502 system_pods.go:89] "storage-provisioner" [b7e97ec5-7df5-4038-88d3-a9e525f999fb] Pending / Ready:ContainersNotReady (containers with unready status: [storage-provisioner]) / ContainersReady:ContainersNotReady (containers with unready status: [storage-provisioner])
	I1124 08:45:07.243672 1655502 retry.go:31] will retry after 414.380006ms: missing components: kube-dns
	I1124 08:45:07.366551 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:07.366750 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:07.369195 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:07.524534 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:07.663819 1655502 system_pods.go:86] 19 kube-system pods found
	I1124 08:45:07.663859 1655502 system_pods.go:89] "coredns-66bc5c9577-z7rx4" [9d258450-8e43-4a1d-83b9-acc957e7e84e] Pending / Ready:ContainersNotReady (containers with unready status: [coredns]) / ContainersReady:ContainersNotReady (containers with unready status: [coredns])
	I1124 08:45:07.663894 1655502 system_pods.go:89] "csi-hostpath-attacher-0" [db4ed975-28ec-4f06-b637-3f85549065bc] Pending / Ready:ContainersNotReady (containers with unready status: [csi-attacher]) / ContainersReady:ContainersNotReady (containers with unready status: [csi-attacher])
	I1124 08:45:07.663911 1655502 system_pods.go:89] "csi-hostpath-resizer-0" [45a8d51b-700a-4053-bcfc-006ebf44747f] Pending / Ready:ContainersNotReady (containers with unready status: [csi-resizer]) / ContainersReady:ContainersNotReady (containers with unready status: [csi-resizer])
	I1124 08:45:07.663921 1655502 system_pods.go:89] "csi-hostpathplugin-h9bf9" [67da1825-4079-4f6c-87f7-a437e37504b6] Pending / Ready:ContainersNotReady (containers with unready status: [csi-external-health-monitor-controller node-driver-registrar hostpath liveness-probe csi-provisioner csi-snapshotter]) / ContainersReady:ContainersNotReady (containers with unready status: [csi-external-health-monitor-controller node-driver-registrar hostpath liveness-probe csi-provisioner csi-snapshotter])
	I1124 08:45:07.663933 1655502 system_pods.go:89] "etcd-addons-674149" [2e4e1dc4-c923-4884-b89d-4f018aeb63e0] Running
	I1124 08:45:07.663939 1655502 system_pods.go:89] "kindnet-r5bjv" [2456a3a4-c304-434f-a3d3-10a649ea0613] Running
	I1124 08:45:07.663943 1655502 system_pods.go:89] "kube-apiserver-addons-674149" [b564c2a6-1c00-4928-a89a-6597cbcc60d9] Running
	I1124 08:45:07.663948 1655502 system_pods.go:89] "kube-controller-manager-addons-674149" [a2adadce-c5ea-4279-8f4e-e8c204ac0f0e] Running
	I1124 08:45:07.663971 1655502 system_pods.go:89] "kube-ingress-dns-minikube" [9ecc5d6e-f8a8-4a39-afbb-34b9e1d839d8] Pending / Ready:ContainersNotReady (containers with unready status: [minikube-ingress-dns]) / ContainersReady:ContainersNotReady (containers with unready status: [minikube-ingress-dns])
	I1124 08:45:07.663985 1655502 system_pods.go:89] "kube-proxy-jzklp" [49037c7f-2825-40b9-a07c-1f95171da2ea] Running
	I1124 08:45:07.663990 1655502 system_pods.go:89] "kube-scheduler-addons-674149" [c8f105b3-a19c-4c03-bc3d-62a30d5b485a] Running
	I1124 08:45:07.663997 1655502 system_pods.go:89] "metrics-server-85b7d694d7-b9zbj" [a7c3b53c-5165-4497-928f-eb4e87ef1d5d] Pending / Ready:ContainersNotReady (containers with unready status: [metrics-server]) / ContainersReady:ContainersNotReady (containers with unready status: [metrics-server])
	I1124 08:45:07.664010 1655502 system_pods.go:89] "nvidia-device-plugin-daemonset-7dms5" [66c6813c-89d6-4546-b6ab-601ab40673c5] Pending / Ready:ContainersNotReady (containers with unready status: [nvidia-device-plugin-ctr]) / ContainersReady:ContainersNotReady (containers with unready status: [nvidia-device-plugin-ctr])
	I1124 08:45:07.664019 1655502 system_pods.go:89] "registry-6b586f9694-dlvbb" [bc6d1721-0552-4008-a2c9-c95ed3408c81] Pending / Ready:ContainersNotReady (containers with unready status: [registry]) / ContainersReady:ContainersNotReady (containers with unready status: [registry])
	I1124 08:45:07.664028 1655502 system_pods.go:89] "registry-creds-764b6fb674-jclkj" [edbb8fb0-f78a-4f96-99a3-61db80d1454e] Pending / Ready:ContainersNotReady (containers with unready status: [registry-creds]) / ContainersReady:ContainersNotReady (containers with unready status: [registry-creds])
	I1124 08:45:07.664047 1655502 system_pods.go:89] "registry-proxy-9kgmn" [f317f307-5762-4134-a0d0-d3ccd22e9b5d] Pending / Ready:ContainersNotReady (containers with unready status: [registry-proxy]) / ContainersReady:ContainersNotReady (containers with unready status: [registry-proxy])
	I1124 08:45:07.664089 1655502 system_pods.go:89] "snapshot-controller-7d9fbc56b8-7f9n5" [d8ae6b06-b7bb-470b-b43b-e0c9dfa97365] Pending / Ready:ContainersNotReady (containers with unready status: [volume-snapshot-controller]) / ContainersReady:ContainersNotReady (containers with unready status: [volume-snapshot-controller])
	I1124 08:45:07.664110 1655502 system_pods.go:89] "snapshot-controller-7d9fbc56b8-94tt9" [ae1cb8ed-db5a-4e57-85ee-0c75841e16b4] Pending / Ready:ContainersNotReady (containers with unready status: [volume-snapshot-controller]) / ContainersReady:ContainersNotReady (containers with unready status: [volume-snapshot-controller])
	I1124 08:45:07.664128 1655502 system_pods.go:89] "storage-provisioner" [b7e97ec5-7df5-4038-88d3-a9e525f999fb] Pending / Ready:ContainersNotReady (containers with unready status: [storage-provisioner]) / ContainersReady:ContainersNotReady (containers with unready status: [storage-provisioner])
	I1124 08:45:07.664148 1655502 retry.go:31] will retry after 477.561909ms: missing components: kube-dns
	I1124 08:45:07.868372 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:07.869077 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:07.869550 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:08.021649 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:08.149132 1655502 system_pods.go:86] 19 kube-system pods found
	I1124 08:45:08.149173 1655502 system_pods.go:89] "coredns-66bc5c9577-z7rx4" [9d258450-8e43-4a1d-83b9-acc957e7e84e] Pending / Ready:ContainersNotReady (containers with unready status: [coredns]) / ContainersReady:ContainersNotReady (containers with unready status: [coredns])
	I1124 08:45:08.149182 1655502 system_pods.go:89] "csi-hostpath-attacher-0" [db4ed975-28ec-4f06-b637-3f85549065bc] Pending / Ready:ContainersNotReady (containers with unready status: [csi-attacher]) / ContainersReady:ContainersNotReady (containers with unready status: [csi-attacher])
	I1124 08:45:08.149219 1655502 system_pods.go:89] "csi-hostpath-resizer-0" [45a8d51b-700a-4053-bcfc-006ebf44747f] Pending / Ready:ContainersNotReady (containers with unready status: [csi-resizer]) / ContainersReady:ContainersNotReady (containers with unready status: [csi-resizer])
	I1124 08:45:08.149236 1655502 system_pods.go:89] "csi-hostpathplugin-h9bf9" [67da1825-4079-4f6c-87f7-a437e37504b6] Pending / Ready:ContainersNotReady (containers with unready status: [csi-external-health-monitor-controller node-driver-registrar hostpath liveness-probe csi-provisioner csi-snapshotter]) / ContainersReady:ContainersNotReady (containers with unready status: [csi-external-health-monitor-controller node-driver-registrar hostpath liveness-probe csi-provisioner csi-snapshotter])
	I1124 08:45:08.149242 1655502 system_pods.go:89] "etcd-addons-674149" [2e4e1dc4-c923-4884-b89d-4f018aeb63e0] Running
	I1124 08:45:08.149257 1655502 system_pods.go:89] "kindnet-r5bjv" [2456a3a4-c304-434f-a3d3-10a649ea0613] Running
	I1124 08:45:08.149261 1655502 system_pods.go:89] "kube-apiserver-addons-674149" [b564c2a6-1c00-4928-a89a-6597cbcc60d9] Running
	I1124 08:45:08.149265 1655502 system_pods.go:89] "kube-controller-manager-addons-674149" [a2adadce-c5ea-4279-8f4e-e8c204ac0f0e] Running
	I1124 08:45:08.149271 1655502 system_pods.go:89] "kube-ingress-dns-minikube" [9ecc5d6e-f8a8-4a39-afbb-34b9e1d839d8] Pending / Ready:ContainersNotReady (containers with unready status: [minikube-ingress-dns]) / ContainersReady:ContainersNotReady (containers with unready status: [minikube-ingress-dns])
	I1124 08:45:08.149305 1655502 system_pods.go:89] "kube-proxy-jzklp" [49037c7f-2825-40b9-a07c-1f95171da2ea] Running
	I1124 08:45:08.149318 1655502 system_pods.go:89] "kube-scheduler-addons-674149" [c8f105b3-a19c-4c03-bc3d-62a30d5b485a] Running
	I1124 08:45:08.149325 1655502 system_pods.go:89] "metrics-server-85b7d694d7-b9zbj" [a7c3b53c-5165-4497-928f-eb4e87ef1d5d] Pending / Ready:ContainersNotReady (containers with unready status: [metrics-server]) / ContainersReady:ContainersNotReady (containers with unready status: [metrics-server])
	I1124 08:45:08.149331 1655502 system_pods.go:89] "nvidia-device-plugin-daemonset-7dms5" [66c6813c-89d6-4546-b6ab-601ab40673c5] Pending / Ready:ContainersNotReady (containers with unready status: [nvidia-device-plugin-ctr]) / ContainersReady:ContainersNotReady (containers with unready status: [nvidia-device-plugin-ctr])
	I1124 08:45:08.149349 1655502 system_pods.go:89] "registry-6b586f9694-dlvbb" [bc6d1721-0552-4008-a2c9-c95ed3408c81] Pending / Ready:ContainersNotReady (containers with unready status: [registry]) / ContainersReady:ContainersNotReady (containers with unready status: [registry])
	I1124 08:45:08.149357 1655502 system_pods.go:89] "registry-creds-764b6fb674-jclkj" [edbb8fb0-f78a-4f96-99a3-61db80d1454e] Pending / Ready:ContainersNotReady (containers with unready status: [registry-creds]) / ContainersReady:ContainersNotReady (containers with unready status: [registry-creds])
	I1124 08:45:08.149364 1655502 system_pods.go:89] "registry-proxy-9kgmn" [f317f307-5762-4134-a0d0-d3ccd22e9b5d] Pending / Ready:ContainersNotReady (containers with unready status: [registry-proxy]) / ContainersReady:ContainersNotReady (containers with unready status: [registry-proxy])
	I1124 08:45:08.149386 1655502 system_pods.go:89] "snapshot-controller-7d9fbc56b8-7f9n5" [d8ae6b06-b7bb-470b-b43b-e0c9dfa97365] Pending / Ready:ContainersNotReady (containers with unready status: [volume-snapshot-controller]) / ContainersReady:ContainersNotReady (containers with unready status: [volume-snapshot-controller])
	I1124 08:45:08.149408 1655502 system_pods.go:89] "snapshot-controller-7d9fbc56b8-94tt9" [ae1cb8ed-db5a-4e57-85ee-0c75841e16b4] Pending / Ready:ContainersNotReady (containers with unready status: [volume-snapshot-controller]) / ContainersReady:ContainersNotReady (containers with unready status: [volume-snapshot-controller])
	I1124 08:45:08.149418 1655502 system_pods.go:89] "storage-provisioner" [b7e97ec5-7df5-4038-88d3-a9e525f999fb] Pending / Ready:ContainersNotReady (containers with unready status: [storage-provisioner]) / ContainersReady:ContainersNotReady (containers with unready status: [storage-provisioner])
	I1124 08:45:08.149434 1655502 retry.go:31] will retry after 704.806848ms: missing components: kube-dns
	I1124 08:45:08.366182 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:08.386219 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:08.386358 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:08.520675 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:08.859193 1655502 system_pods.go:86] 19 kube-system pods found
	I1124 08:45:08.859232 1655502 system_pods.go:89] "coredns-66bc5c9577-z7rx4" [9d258450-8e43-4a1d-83b9-acc957e7e84e] Running
	I1124 08:45:08.859279 1655502 system_pods.go:89] "csi-hostpath-attacher-0" [db4ed975-28ec-4f06-b637-3f85549065bc] Pending / Ready:ContainersNotReady (containers with unready status: [csi-attacher]) / ContainersReady:ContainersNotReady (containers with unready status: [csi-attacher])
	I1124 08:45:08.859294 1655502 system_pods.go:89] "csi-hostpath-resizer-0" [45a8d51b-700a-4053-bcfc-006ebf44747f] Pending / Ready:ContainersNotReady (containers with unready status: [csi-resizer]) / ContainersReady:ContainersNotReady (containers with unready status: [csi-resizer])
	I1124 08:45:08.859304 1655502 system_pods.go:89] "csi-hostpathplugin-h9bf9" [67da1825-4079-4f6c-87f7-a437e37504b6] Pending / Ready:ContainersNotReady (containers with unready status: [csi-external-health-monitor-controller node-driver-registrar hostpath liveness-probe csi-provisioner csi-snapshotter]) / ContainersReady:ContainersNotReady (containers with unready status: [csi-external-health-monitor-controller node-driver-registrar hostpath liveness-probe csi-provisioner csi-snapshotter])
	I1124 08:45:08.859314 1655502 system_pods.go:89] "etcd-addons-674149" [2e4e1dc4-c923-4884-b89d-4f018aeb63e0] Running
	I1124 08:45:08.859319 1655502 system_pods.go:89] "kindnet-r5bjv" [2456a3a4-c304-434f-a3d3-10a649ea0613] Running
	I1124 08:45:08.859324 1655502 system_pods.go:89] "kube-apiserver-addons-674149" [b564c2a6-1c00-4928-a89a-6597cbcc60d9] Running
	I1124 08:45:08.859355 1655502 system_pods.go:89] "kube-controller-manager-addons-674149" [a2adadce-c5ea-4279-8f4e-e8c204ac0f0e] Running
	I1124 08:45:08.859371 1655502 system_pods.go:89] "kube-ingress-dns-minikube" [9ecc5d6e-f8a8-4a39-afbb-34b9e1d839d8] Pending / Ready:ContainersNotReady (containers with unready status: [minikube-ingress-dns]) / ContainersReady:ContainersNotReady (containers with unready status: [minikube-ingress-dns])
	I1124 08:45:08.859375 1655502 system_pods.go:89] "kube-proxy-jzklp" [49037c7f-2825-40b9-a07c-1f95171da2ea] Running
	I1124 08:45:08.859380 1655502 system_pods.go:89] "kube-scheduler-addons-674149" [c8f105b3-a19c-4c03-bc3d-62a30d5b485a] Running
	I1124 08:45:08.859398 1655502 system_pods.go:89] "metrics-server-85b7d694d7-b9zbj" [a7c3b53c-5165-4497-928f-eb4e87ef1d5d] Pending / Ready:ContainersNotReady (containers with unready status: [metrics-server]) / ContainersReady:ContainersNotReady (containers with unready status: [metrics-server])
	I1124 08:45:08.859412 1655502 system_pods.go:89] "nvidia-device-plugin-daemonset-7dms5" [66c6813c-89d6-4546-b6ab-601ab40673c5] Pending / Ready:ContainersNotReady (containers with unready status: [nvidia-device-plugin-ctr]) / ContainersReady:ContainersNotReady (containers with unready status: [nvidia-device-plugin-ctr])
	I1124 08:45:08.859451 1655502 system_pods.go:89] "registry-6b586f9694-dlvbb" [bc6d1721-0552-4008-a2c9-c95ed3408c81] Pending / Ready:ContainersNotReady (containers with unready status: [registry]) / ContainersReady:ContainersNotReady (containers with unready status: [registry])
	I1124 08:45:08.859457 1655502 system_pods.go:89] "registry-creds-764b6fb674-jclkj" [edbb8fb0-f78a-4f96-99a3-61db80d1454e] Pending / Ready:ContainersNotReady (containers with unready status: [registry-creds]) / ContainersReady:ContainersNotReady (containers with unready status: [registry-creds])
	I1124 08:45:08.859490 1655502 system_pods.go:89] "registry-proxy-9kgmn" [f317f307-5762-4134-a0d0-d3ccd22e9b5d] Pending / Ready:ContainersNotReady (containers with unready status: [registry-proxy]) / ContainersReady:ContainersNotReady (containers with unready status: [registry-proxy])
	I1124 08:45:08.859505 1655502 system_pods.go:89] "snapshot-controller-7d9fbc56b8-7f9n5" [d8ae6b06-b7bb-470b-b43b-e0c9dfa97365] Pending / Ready:ContainersNotReady (containers with unready status: [volume-snapshot-controller]) / ContainersReady:ContainersNotReady (containers with unready status: [volume-snapshot-controller])
	I1124 08:45:08.859513 1655502 system_pods.go:89] "snapshot-controller-7d9fbc56b8-94tt9" [ae1cb8ed-db5a-4e57-85ee-0c75841e16b4] Pending / Ready:ContainersNotReady (containers with unready status: [volume-snapshot-controller]) / ContainersReady:ContainersNotReady (containers with unready status: [volume-snapshot-controller])
	I1124 08:45:08.859521 1655502 system_pods.go:89] "storage-provisioner" [b7e97ec5-7df5-4038-88d3-a9e525f999fb] Running
	I1124 08:45:08.859530 1655502 system_pods.go:126] duration metric: took 2.51102659s to wait for k8s-apps to be running ...
	I1124 08:45:08.859540 1655502 system_svc.go:44] waiting for kubelet service to be running ....
	I1124 08:45:08.859613 1655502 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1124 08:45:08.864717 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:08.866662 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:08.868772 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:08.884604 1655502 system_svc.go:56] duration metric: took 25.05546ms WaitForService to wait for kubelet
	I1124 08:45:08.884639 1655502 kubeadm.go:587] duration metric: took 44.377590072s to wait for: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I1124 08:45:08.884659 1655502 node_conditions.go:102] verifying NodePressure condition ...
	I1124 08:45:08.887871 1655502 node_conditions.go:122] node storage ephemeral capacity is 203034800Ki
	I1124 08:45:08.887906 1655502 node_conditions.go:123] node cpu capacity is 2
	I1124 08:45:08.887920 1655502 node_conditions.go:105] duration metric: took 3.213984ms to run NodePressure ...
	I1124 08:45:08.887953 1655502 start.go:242] waiting for startup goroutines ...
	I1124 08:45:09.021621 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:09.367301 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:09.367736 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:09.369767 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:09.521379 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:09.864457 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:09.865869 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:09.867505 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:10.021679 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:10.374181 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:10.374492 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:10.374644 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:10.521070 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:10.866304 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:10.866523 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:10.868735 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:11.021314 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:11.369085 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:11.369479 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:11.370993 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:11.521138 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:11.867649 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:11.867870 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:11.869432 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:12.020688 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:12.365523 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:12.365644 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:12.367801 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:12.521820 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:12.871113 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:12.873167 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:12.883174 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:13.021887 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:13.364066 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:13.367785 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:13.368866 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:13.521125 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:13.869336 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:13.869823 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:13.873108 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:14.021954 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:14.368649 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:14.369155 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:14.369623 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:14.521668 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:14.865492 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:14.872425 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:14.873086 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:15.023566 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:15.364113 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:15.365569 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:15.366750 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:15.520683 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:15.864461 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:15.867330 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:15.868380 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:16.020838 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:16.364662 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:16.369446 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:16.369908 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:16.521371 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:16.863456 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:16.866248 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:16.867453 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:17.020528 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:17.363218 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:17.366555 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:17.367762 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:17.520684 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:17.866138 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:17.866262 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:17.868193 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:18.021002 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:18.363838 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:18.366890 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:18.374855 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:18.523987 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:18.868386 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:18.868471 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:18.869142 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:19.020796 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:19.371133 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:19.371622 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:19.372045 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:19.521917 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:19.865661 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:19.865761 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:19.867759 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:20.069642 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:20.364715 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:20.367141 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:20.367285 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:20.521003 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:20.872805 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:20.873289 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:20.873789 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:21.021648 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:21.366182 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:21.369574 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:21.369663 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:21.520659 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:21.866865 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:21.867198 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:21.867555 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:22.020858 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:22.370511 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:22.370646 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:22.370765 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:22.529899 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:22.865798 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:22.865990 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:22.867900 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:23.021156 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:23.368127 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:23.368568 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:23.370955 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:23.521752 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:23.871707 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:23.873386 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:23.969761 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:24.021908 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:24.364502 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:24.369525 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:24.370059 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:24.520120 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:24.869912 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:24.870720 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:24.870957 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:25.024214 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:25.363642 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:25.367330 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:25.367609 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:25.520750 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:25.867834 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:25.869036 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:25.897309 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:26.020662 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:26.364651 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:26.365997 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:26.367477 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:26.520558 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:26.868765 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:26.870882 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:26.885025 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:27.020956 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:27.366182 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:27.366601 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:27.367602 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:27.521046 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:27.863483 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:27.867192 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:27.867762 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:28.021403 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:28.364687 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:28.368831 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:28.369017 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:28.522408 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:28.875446 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:28.875641 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:28.875787 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:29.020810 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:29.364969 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:29.367510 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:29.367617 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:29.520534 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:29.874072 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:29.874344 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:29.874553 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:30.025177 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:30.365647 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:30.369614 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:30.370305 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:30.520915 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:30.868626 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:30.868986 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:30.870518 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:31.037959 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:31.367258 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:31.372522 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:31.373881 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:31.521451 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:31.864542 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:31.868854 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:31.868958 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:32.038921 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:32.365466 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:32.365581 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:32.367564 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:32.520702 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:32.864456 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:32.867037 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:32.867222 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:33.020464 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:33.364036 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:33.366435 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:33.367101 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:33.520382 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:33.868284 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:33.868776 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:33.870050 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:34.020689 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:34.364150 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:34.367405 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:34.367618 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:34.522191 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:34.865061 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:34.865272 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:34.866929 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:35.022569 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:35.363944 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:35.366026 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:35.367408 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:35.520466 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:35.865174 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:35.866796 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:35.866923 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:36.021830 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:36.364925 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:36.365316 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:36.367833 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:36.521939 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:36.866761 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:36.867140 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:36.868168 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:37.021526 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:37.366354 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:37.366804 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:37.369614 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:37.520635 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:37.866833 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:37.866908 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:37.868089 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:38.022924 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:38.365415 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:38.367184 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:38.367334 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:38.522582 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:38.865200 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:38.865251 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:38.867386 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:39.020636 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:39.365336 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:39.366437 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:39.368143 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:39.520804 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:39.865554 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:39.867783 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:39.868406 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:40.028019 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:40.367036 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:40.367291 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:40.368543 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:40.525338 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:40.865864 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:40.866110 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:40.868396 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:41.020901 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:41.363893 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:41.366234 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:41.368458 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:41.520236 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:41.866431 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:41.867103 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:41.869175 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:42.021803 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:42.366241 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:42.367711 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:42.369603 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:42.521422 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:42.866156 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:42.867285 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:42.868931 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:43.022640 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:43.369247 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:43.370608 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:43.371143 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:43.519855 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:43.867892 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I1124 08:45:43.869273 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:43.869339 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:44.021293 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:44.364944 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:44.366576 1655502 kapi.go:107] duration metric: took 1m11.004409111s to wait for kubernetes.io/minikube-addons=registry ...
	I1124 08:45:44.367917 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:44.521032 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:44.864843 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:44.867837 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:45.046263 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:45.364734 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:45.368183 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:45.526253 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:45.864719 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:45.867893 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:46.021240 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:46.365083 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:46.368695 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:46.522899 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:46.865165 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:46.868837 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:47.021210 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:47.363852 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:47.367609 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:47.520161 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:47.865285 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:47.868230 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:48.021111 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:48.363950 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:48.368013 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:48.521282 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:48.864345 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:48.867680 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:49.022648 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:49.364309 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:49.366588 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:49.524048 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:49.864721 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:49.868251 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:50.023705 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:50.365230 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:50.368705 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:50.520862 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:50.865858 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:50.867955 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:51.021921 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:51.363916 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:51.367858 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:51.521199 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:51.864948 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:51.867600 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:52.021305 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:52.363749 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:52.367940 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:52.521391 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:52.864270 1655502 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I1124 08:45:52.866815 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:53.020982 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:53.364893 1655502 kapi.go:107] duration metric: took 1m20.004650386s to wait for app.kubernetes.io/name=ingress-nginx ...
	I1124 08:45:53.367082 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:53.520696 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:53.868364 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:54.026396 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:54.368275 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:54.520050 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:54.867935 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:55.022265 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:55.368072 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:55.520254 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:55.871633 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:56.037662 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I1124 08:45:56.377441 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:56.520343 1655502 kapi.go:107] duration metric: took 1m22.003137798s to wait for kubernetes.io/minikube-addons=gcp-auth ...
	I1124 08:45:56.523300 1655502 out.go:179] * Your GCP credentials will now be mounted into every pod created in the addons-674149 cluster.
	I1124 08:45:56.526205 1655502 out.go:179] * If you don't want your credentials mounted into a specific pod, add a label with the `gcp-auth-skip-secret` key to your pod configuration.
	I1124 08:45:56.529125 1655502 out.go:179] * If you want existing pods to be mounted with credentials, either recreate them or rerun addons enable with --refresh.
	I1124 08:45:56.867760 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:57.405266 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:57.868117 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:58.367623 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:58.867822 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:59.368105 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:45:59.867266 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:46:00.368805 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:46:00.868531 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:46:01.368075 1655502 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I1124 08:46:01.869373 1655502 kapi.go:107] duration metric: took 1m28.50542876s to wait for kubernetes.io/minikube-addons=csi-hostpath-driver ...
	I1124 08:46:01.874015 1655502 out.go:179] * Enabled addons: volcano, registry-creds, nvidia-device-plugin, cloud-spanner, storage-provisioner, amd-gpu-device-plugin, inspektor-gadget, ingress-dns, metrics-server, yakd, storage-provisioner-rancher, volumesnapshots, registry, ingress, gcp-auth, csi-hostpath-driver
	I1124 08:46:01.877552 1655502 addons.go:530] duration metric: took 1m37.369382124s for enable addons: enabled=[volcano registry-creds nvidia-device-plugin cloud-spanner storage-provisioner amd-gpu-device-plugin inspektor-gadget ingress-dns metrics-server yakd storage-provisioner-rancher volumesnapshots registry ingress gcp-auth csi-hostpath-driver]
	I1124 08:46:01.877639 1655502 start.go:247] waiting for cluster config update ...
	I1124 08:46:01.877663 1655502 start.go:256] writing updated cluster config ...
	I1124 08:46:01.878657 1655502 ssh_runner.go:195] Run: rm -f paused
	I1124 08:46:01.883294 1655502 pod_ready.go:37] extra waiting up to 4m0s for all "kube-system" pods having one of [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] labels to be "Ready" ...
	I1124 08:46:01.887108 1655502 pod_ready.go:83] waiting for pod "coredns-66bc5c9577-z7rx4" in "kube-system" namespace to be "Ready" or be gone ...
	I1124 08:46:01.894403 1655502 pod_ready.go:94] pod "coredns-66bc5c9577-z7rx4" is "Ready"
	I1124 08:46:01.894439 1655502 pod_ready.go:86] duration metric: took 7.304278ms for pod "coredns-66bc5c9577-z7rx4" in "kube-system" namespace to be "Ready" or be gone ...
	I1124 08:46:01.897623 1655502 pod_ready.go:83] waiting for pod "etcd-addons-674149" in "kube-system" namespace to be "Ready" or be gone ...
	I1124 08:46:01.902488 1655502 pod_ready.go:94] pod "etcd-addons-674149" is "Ready"
	I1124 08:46:01.902561 1655502 pod_ready.go:86] duration metric: took 4.908901ms for pod "etcd-addons-674149" in "kube-system" namespace to be "Ready" or be gone ...
	I1124 08:46:01.904933 1655502 pod_ready.go:83] waiting for pod "kube-apiserver-addons-674149" in "kube-system" namespace to be "Ready" or be gone ...
	I1124 08:46:01.910678 1655502 pod_ready.go:94] pod "kube-apiserver-addons-674149" is "Ready"
	I1124 08:46:01.910707 1655502 pod_ready.go:86] duration metric: took 5.744886ms for pod "kube-apiserver-addons-674149" in "kube-system" namespace to be "Ready" or be gone ...
	I1124 08:46:01.913526 1655502 pod_ready.go:83] waiting for pod "kube-controller-manager-addons-674149" in "kube-system" namespace to be "Ready" or be gone ...
	I1124 08:46:02.288244 1655502 pod_ready.go:94] pod "kube-controller-manager-addons-674149" is "Ready"
	I1124 08:46:02.288279 1655502 pod_ready.go:86] duration metric: took 374.725384ms for pod "kube-controller-manager-addons-674149" in "kube-system" namespace to be "Ready" or be gone ...
	I1124 08:46:02.487976 1655502 pod_ready.go:83] waiting for pod "kube-proxy-jzklp" in "kube-system" namespace to be "Ready" or be gone ...
	I1124 08:46:02.887156 1655502 pod_ready.go:94] pod "kube-proxy-jzklp" is "Ready"
	I1124 08:46:02.887187 1655502 pod_ready.go:86] duration metric: took 399.187221ms for pod "kube-proxy-jzklp" in "kube-system" namespace to be "Ready" or be gone ...
	I1124 08:46:03.087833 1655502 pod_ready.go:83] waiting for pod "kube-scheduler-addons-674149" in "kube-system" namespace to be "Ready" or be gone ...
	I1124 08:46:03.486784 1655502 pod_ready.go:94] pod "kube-scheduler-addons-674149" is "Ready"
	I1124 08:46:03.486812 1655502 pod_ready.go:86] duration metric: took 398.952013ms for pod "kube-scheduler-addons-674149" in "kube-system" namespace to be "Ready" or be gone ...
	I1124 08:46:03.486826 1655502 pod_ready.go:40] duration metric: took 1.603493172s for extra waiting for all "kube-system" pods having one of [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] labels to be "Ready" ...
	I1124 08:46:03.539602 1655502 start.go:625] kubectl: 1.33.2, cluster: 1.34.2 (minor skew: 1)
	I1124 08:46:03.543299 1655502 out.go:179] * Done! kubectl is now configured to use "addons-674149" cluster and "default" namespace by default
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                      ATTEMPT             POD ID              POD                                       NAMESPACE
	1065e001e1694       ce2d2cda2d858       3 minutes ago       Running             hello-world-app           0                   01c6f014998db       hello-world-app-5d498dc89-g9c26           default
	6546e01fb23cb       cbad6347cca28       3 minutes ago       Running             nginx                     0                   d18ae611325e1       nginx                                     default
	95a358441b55f       1611cd07b61d5       4 minutes ago       Running             busybox                   0                   a761d3bce3506       busybox                                   default
	79ec3dda400eb       7ce2150c8929b       5 minutes ago       Running             local-path-provisioner    0                   0b041a9c2189a       local-path-provisioner-648f6765c9-76xdw   local-path-storage
	ce439678efef2       138784d87c9c5       5 minutes ago       Running             coredns                   0                   bca4c7c9ee9a7       coredns-66bc5c9577-z7rx4                  kube-system
	03141ec1a42aa       ba04bb24b9575       5 minutes ago       Running             storage-provisioner       0                   ad96d3c61f503       storage-provisioner                       kube-system
	b7dc458fbc9f4       94bff1bec29fd       6 minutes ago       Running             kube-proxy                0                   a8684a735c8b2       kube-proxy-jzklp                          kube-system
	e4fa1d980373a       b1a8c6f707935       6 minutes ago       Running             kindnet-cni               0                   554c609be1e9f       kindnet-r5bjv                             kube-system
	2a12d84eb6575       4f982e73e768a       6 minutes ago       Running             kube-scheduler            0                   a563e40f40422       kube-scheduler-addons-674149              kube-system
	d7c8cc0aaba10       2c5f0dedd21c2       6 minutes ago       Running             etcd                      0                   36f1080edfba4       etcd-addons-674149                        kube-system
	3e8aa15421719       b178af3d91f80       6 minutes ago       Running             kube-apiserver            0                   0cca17045be76       kube-apiserver-addons-674149              kube-system
	803b3ef862de5       1b34917560f09       6 minutes ago       Running             kube-controller-manager   0                   b90d1332adb84       kube-controller-manager-addons-674149     kube-system
	
	
	==> containerd <==
	Nov 24 08:50:44 addons-674149 containerd[750]: time="2025-11-24T08:50:44.168083911Z" level=info msg="container event discarded" container=0e2d245787270a956eb346e03824856f729cd4cc6c5bc4eb78dabfea52b0fbb0 type=CONTAINER_DELETED_EVENT
	Nov 24 08:50:44 addons-674149 containerd[750]: time="2025-11-24T08:50:44.220558052Z" level=info msg="container event discarded" container=a05f3ca807c08e1767293dc1989a6900e5fe06c2a3baa872e76071e12db8cbb3 type=CONTAINER_STOPPED_EVENT
	Nov 24 08:50:44 addons-674149 containerd[750]: time="2025-11-24T08:50:44.263966766Z" level=info msg="container event discarded" container=4bec745cbca20dcbebada6535956aed1c87c1fe4243d2fbfa671dc4c531cc3ef type=CONTAINER_CREATED_EVENT
	Nov 24 08:50:44 addons-674149 containerd[750]: time="2025-11-24T08:50:44.357309212Z" level=info msg="container event discarded" container=4bec745cbca20dcbebada6535956aed1c87c1fe4243d2fbfa671dc4c531cc3ef type=CONTAINER_STARTED_EVENT
	Nov 24 08:50:45 addons-674149 containerd[750]: time="2025-11-24T08:50:45.408568121Z" level=info msg="container event discarded" container=8b4d45504dc9e315bb55d044d06135e05a826eb8ad669a6029d7675c715320aa type=CONTAINER_STOPPED_EVENT
	Nov 24 08:50:45 addons-674149 containerd[750]: time="2025-11-24T08:50:45.579611187Z" level=info msg="container event discarded" container=a10061523d94955eb2f029aa623cc92a01f315c0a7eb173d909e49c1afe9ff86 type=CONTAINER_CREATED_EVENT
	Nov 24 08:50:45 addons-674149 containerd[750]: time="2025-11-24T08:50:45.792411063Z" level=info msg="container event discarded" container=a10061523d94955eb2f029aa623cc92a01f315c0a7eb173d909e49c1afe9ff86 type=CONTAINER_STARTED_EVENT
	Nov 24 08:50:46 addons-674149 containerd[750]: time="2025-11-24T08:50:46.383467664Z" level=info msg="container event discarded" container=cd5122116a14418e6efb62b278b895979c70a00f83e87425c9ed6196752881a0 type=CONTAINER_STOPPED_EVENT
	Nov 24 08:50:47 addons-674149 containerd[750]: time="2025-11-24T08:50:47.954682856Z" level=info msg="container event discarded" container=bb1f0dd23150502c257afabda1dee377c2207af7a355e7783f225b5503ae3723 type=CONTAINER_CREATED_EVENT
	Nov 24 08:50:48 addons-674149 containerd[750]: time="2025-11-24T08:50:48.043446770Z" level=info msg="container event discarded" container=bb1f0dd23150502c257afabda1dee377c2207af7a355e7783f225b5503ae3723 type=CONTAINER_STARTED_EVENT
	Nov 24 08:50:52 addons-674149 containerd[750]: time="2025-11-24T08:50:52.478497464Z" level=info msg="container event discarded" container=470deefe0a7c9f4f617db29e6da289d7703d989acf855ed5752c06296a7595da type=CONTAINER_CREATED_EVENT
	Nov 24 08:50:52 addons-674149 containerd[750]: time="2025-11-24T08:50:52.577755366Z" level=info msg="container event discarded" container=470deefe0a7c9f4f617db29e6da289d7703d989acf855ed5752c06296a7595da type=CONTAINER_STARTED_EVENT
	Nov 24 08:50:55 addons-674149 containerd[750]: time="2025-11-24T08:50:55.334714604Z" level=info msg="container event discarded" container=528907fbdc84d5dea4a277ae42149854183e919f81100ac8c6e63e198f07bd8c type=CONTAINER_CREATED_EVENT
	Nov 24 08:50:55 addons-674149 containerd[750]: time="2025-11-24T08:50:55.425053337Z" level=info msg="container event discarded" container=528907fbdc84d5dea4a277ae42149854183e919f81100ac8c6e63e198f07bd8c type=CONTAINER_STARTED_EVENT
	Nov 24 08:50:56 addons-674149 containerd[750]: time="2025-11-24T08:50:56.598760945Z" level=info msg="container event discarded" container=d80727fcc24a9a7d6347a5f0082c279ec74cfa5e0800550d1dc1c551a7f0933c type=CONTAINER_CREATED_EVENT
	Nov 24 08:50:56 addons-674149 containerd[750]: time="2025-11-24T08:50:56.692150049Z" level=info msg="container event discarded" container=d80727fcc24a9a7d6347a5f0082c279ec74cfa5e0800550d1dc1c551a7f0933c type=CONTAINER_STARTED_EVENT
	Nov 24 08:50:57 addons-674149 containerd[750]: time="2025-11-24T08:50:57.218796096Z" level=info msg="PullImage \"busybox:stable\""
	Nov 24 08:50:57 addons-674149 containerd[750]: time="2025-11-24T08:50:57.608381022Z" level=info msg="container event discarded" container=1016dc0dd1c99f0700f90b7bd83aeee4701b58074ca4fb9665253f6318a5b9e9 type=CONTAINER_CREATED_EVENT
	Nov 24 08:50:57 addons-674149 containerd[750]: time="2025-11-24T08:50:57.710948053Z" level=info msg="container event discarded" container=1016dc0dd1c99f0700f90b7bd83aeee4701b58074ca4fb9665253f6318a5b9e9 type=CONTAINER_STARTED_EVENT
	Nov 24 08:50:57 addons-674149 containerd[750]: time="2025-11-24T08:50:57.756699835Z" level=error msg="PullImage \"busybox:stable\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"docker.io/library/busybox:stable\": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/library/busybox/manifests/sha256:079b4a73854a059a2073c6e1a031b17fcbf23a47c6c59ae760d78045199e403c: 429 Too Many Requests\ntoomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit"
	Nov 24 08:50:57 addons-674149 containerd[750]: time="2025-11-24T08:50:57.756725993Z" level=info msg="stop pulling image docker.io/library/busybox:stable: active requests=0, bytes read=20514"
	Nov 24 08:50:59 addons-674149 containerd[750]: time="2025-11-24T08:50:59.124080182Z" level=info msg="container event discarded" container=09a53d598c28e4ecd8941614403ac3a31c1b156f03b6ced2bd556bc14343069b type=CONTAINER_CREATED_EVENT
	Nov 24 08:50:59 addons-674149 containerd[750]: time="2025-11-24T08:50:59.283019336Z" level=info msg="container event discarded" container=09a53d598c28e4ecd8941614403ac3a31c1b156f03b6ced2bd556bc14343069b type=CONTAINER_STARTED_EVENT
	Nov 24 08:51:00 addons-674149 containerd[750]: time="2025-11-24T08:51:00.842349372Z" level=info msg="container event discarded" container=9f20f6ec44d3425ac305ebb92473972886c2dc10005b69cefea3d6f7102c802a type=CONTAINER_CREATED_EVENT
	Nov 24 08:51:00 addons-674149 containerd[750]: time="2025-11-24T08:51:00.997241202Z" level=info msg="container event discarded" container=9f20f6ec44d3425ac305ebb92473972886c2dc10005b69cefea3d6f7102c802a type=CONTAINER_STARTED_EVENT
	
	
	==> coredns [ce439678efef28ba1791542b9940ae503c1a72c533b69d2df95c402e894fe7ea] <==
	[INFO] 10.244.0.24:42361 - 57823 "A IN hello-world-app.default.svc.cluster.local. udp 59 false 512" NOERROR qr,aa,rd 116 0.000105273s
	[INFO] 10.244.0.24:55949 - 48858 "AAAA IN hello-world-app.default.svc.cluster.local.svc.cluster.local. udp 77 false 512" NXDOMAIN qr,aa,rd 170 0.000089093s
	[INFO] 10.244.0.24:55949 - 37046 "A IN hello-world-app.default.svc.cluster.local.cluster.local. udp 73 false 512" NXDOMAIN qr,aa,rd 166 0.000183379s
	[INFO] 10.244.0.24:55949 - 62821 "AAAA IN hello-world-app.default.svc.cluster.local.cluster.local. udp 73 false 512" NXDOMAIN qr,aa,rd 166 0.000097109s
	[INFO] 10.244.0.24:55949 - 36179 "A IN hello-world-app.default.svc.cluster.local.us-east-2.compute.internal. udp 86 false 512" NXDOMAIN qr,rd,ra 86 0.00493215s
	[INFO] 10.244.0.24:55949 - 2435 "AAAA IN hello-world-app.default.svc.cluster.local.us-east-2.compute.internal. udp 86 false 512" NXDOMAIN qr,rd,ra 86 0.001738101s
	[INFO] 10.244.0.24:55949 - 61773 "A IN hello-world-app.default.svc.cluster.local. udp 59 false 512" NOERROR qr,aa,rd 116 0.000314113s
	[INFO] 10.244.0.24:46697 - 16878 "A IN hello-world-app.default.svc.cluster.local.ingress-nginx.svc.cluster.local. udp 91 false 512" NXDOMAIN qr,aa,rd 184 0.000164843s
	[INFO] 10.244.0.24:46728 - 5104 "A IN hello-world-app.default.svc.cluster.local.ingress-nginx.svc.cluster.local. udp 91 false 512" NXDOMAIN qr,aa,rd 184 0.000094852s
	[INFO] 10.244.0.24:46728 - 14778 "AAAA IN hello-world-app.default.svc.cluster.local.ingress-nginx.svc.cluster.local. udp 91 false 512" NXDOMAIN qr,aa,rd 184 0.000107932s
	[INFO] 10.244.0.24:46697 - 564 "AAAA IN hello-world-app.default.svc.cluster.local.ingress-nginx.svc.cluster.local. udp 91 false 512" NXDOMAIN qr,aa,rd 184 0.000071869s
	[INFO] 10.244.0.24:46728 - 18351 "A IN hello-world-app.default.svc.cluster.local.svc.cluster.local. udp 77 false 512" NXDOMAIN qr,aa,rd 170 0.000102081s
	[INFO] 10.244.0.24:46697 - 57226 "A IN hello-world-app.default.svc.cluster.local.svc.cluster.local. udp 77 false 512" NXDOMAIN qr,aa,rd 170 0.000063114s
	[INFO] 10.244.0.24:46728 - 27183 "AAAA IN hello-world-app.default.svc.cluster.local.svc.cluster.local. udp 77 false 512" NXDOMAIN qr,aa,rd 170 0.000085729s
	[INFO] 10.244.0.24:46697 - 56816 "AAAA IN hello-world-app.default.svc.cluster.local.svc.cluster.local. udp 77 false 512" NXDOMAIN qr,aa,rd 170 0.000076998s
	[INFO] 10.244.0.24:46728 - 4601 "A IN hello-world-app.default.svc.cluster.local.cluster.local. udp 73 false 512" NXDOMAIN qr,aa,rd 166 0.000095525s
	[INFO] 10.244.0.24:46697 - 38167 "A IN hello-world-app.default.svc.cluster.local.cluster.local. udp 73 false 512" NXDOMAIN qr,aa,rd 166 0.000096969s
	[INFO] 10.244.0.24:46728 - 40394 "AAAA IN hello-world-app.default.svc.cluster.local.cluster.local. udp 73 false 512" NXDOMAIN qr,aa,rd 166 0.000085876s
	[INFO] 10.244.0.24:46697 - 59398 "AAAA IN hello-world-app.default.svc.cluster.local.cluster.local. udp 73 false 512" NXDOMAIN qr,aa,rd 166 0.00091404s
	[INFO] 10.244.0.24:46728 - 512 "A IN hello-world-app.default.svc.cluster.local.us-east-2.compute.internal. udp 86 false 512" NXDOMAIN qr,rd,ra 86 0.001466385s
	[INFO] 10.244.0.24:46697 - 27243 "A IN hello-world-app.default.svc.cluster.local.us-east-2.compute.internal. udp 86 false 512" NXDOMAIN qr,rd,ra 86 0.001462085s
	[INFO] 10.244.0.24:46728 - 19869 "AAAA IN hello-world-app.default.svc.cluster.local.us-east-2.compute.internal. udp 86 false 512" NXDOMAIN qr,rd,ra 86 0.001286468s
	[INFO] 10.244.0.24:46728 - 61859 "A IN hello-world-app.default.svc.cluster.local. udp 59 false 512" NOERROR qr,aa,rd 116 0.000129905s
	[INFO] 10.244.0.24:46697 - 60802 "AAAA IN hello-world-app.default.svc.cluster.local.us-east-2.compute.internal. udp 86 false 512" NXDOMAIN qr,rd,ra 86 0.001103877s
	[INFO] 10.244.0.24:46697 - 43315 "A IN hello-world-app.default.svc.cluster.local. udp 59 false 512" NOERROR qr,aa,rd 116 0.000109081s
	
	
	==> describe nodes <==
	Name:               addons-674149
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=arm64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=arm64
	                    kubernetes.io/hostname=addons-674149
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=393ee3e0b845623107dce6cda4f48ffd5c3d1811
	                    minikube.k8s.io/name=addons-674149
	                    minikube.k8s.io/primary=true
	                    minikube.k8s.io/updated_at=2025_11_24T08_44_20_0700
	                    minikube.k8s.io/version=v1.37.0
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	                    topology.hostpath.csi/node=addons-674149
	Annotations:        node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Mon, 24 Nov 2025 08:44:16 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  addons-674149
	  AcquireTime:     <unset>
	  RenewTime:       Mon, 24 Nov 2025 08:50:57 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Mon, 24 Nov 2025 08:48:25 +0000   Mon, 24 Nov 2025 08:44:13 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Mon, 24 Nov 2025 08:48:25 +0000   Mon, 24 Nov 2025 08:44:13 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Mon, 24 Nov 2025 08:48:25 +0000   Mon, 24 Nov 2025 08:44:13 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Mon, 24 Nov 2025 08:48:25 +0000   Mon, 24 Nov 2025 08:45:06 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.168.49.2
	  Hostname:    addons-674149
	Capacity:
	  cpu:                2
	  ephemeral-storage:  203034800Ki
	  hugepages-1Gi:      0
	  hugepages-2Mi:      0
	  hugepages-32Mi:     0
	  hugepages-64Ki:     0
	  memory:             8022296Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  203034800Ki
	  hugepages-1Gi:      0
	  hugepages-2Mi:      0
	  hugepages-32Mi:     0
	  hugepages-64Ki:     0
	  memory:             8022296Ki
	  pods:               110
	System Info:
	  Machine ID:                 7283ea1857f18f20a875c29069214c9d
	  System UUID:                75b2f83f-e680-4bff-b86d-e414ad07778f
	  Boot ID:                    e6ca431c-3a35-478f-87f6-f49cc4bc8a65
	  Kernel Version:             5.15.0-1084-aws
	  OS Image:                   Debian GNU/Linux 12 (bookworm)
	  Operating System:           linux
	  Architecture:               arm64
	  Container Runtime Version:  containerd://2.1.5
	  Kubelet Version:            v1.34.2
	  Kube-Proxy Version:         
	PodCIDR:                      10.244.0.0/24
	PodCIDRs:                     10.244.0.0/24
	Non-terminated Pods:          (13 in total)
	  Namespace                   Name                                       CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                       ------------  ----------  ---------------  -------------  ---
	  default                     busybox                                    0 (0%)        0 (0%)      0 (0%)           0 (0%)         4m19s
	  default                     hello-world-app-5d498dc89-g9c26            0 (0%)        0 (0%)      0 (0%)           0 (0%)         3m16s
	  default                     nginx                                      0 (0%)        0 (0%)      0 (0%)           0 (0%)         3m25s
	  default                     test-local-path                            0 (0%)        0 (0%)      0 (0%)           0 (0%)         3m7s
	  kube-system                 coredns-66bc5c9577-z7rx4                   100m (5%)     0 (0%)      70Mi (0%)        170Mi (2%)     6m40s
	  kube-system                 etcd-addons-674149                         100m (5%)     0 (0%)      100Mi (1%)       0 (0%)         6m45s
	  kube-system                 kindnet-r5bjv                              100m (5%)     100m (5%)   50Mi (0%)        50Mi (0%)      6m40s
	  kube-system                 kube-apiserver-addons-674149               250m (12%)    0 (0%)      0 (0%)           0 (0%)         6m46s
	  kube-system                 kube-controller-manager-addons-674149      200m (10%)    0 (0%)      0 (0%)           0 (0%)         6m45s
	  kube-system                 kube-proxy-jzklp                           0 (0%)        0 (0%)      0 (0%)           0 (0%)         6m40s
	  kube-system                 kube-scheduler-addons-674149               100m (5%)     0 (0%)      0 (0%)           0 (0%)         6m47s
	  kube-system                 storage-provisioner                        0 (0%)        0 (0%)      0 (0%)           0 (0%)         6m35s
	  local-path-storage          local-path-provisioner-648f6765c9-76xdw    0 (0%)        0 (0%)      0 (0%)           0 (0%)         6m37s
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests    Limits
	  --------           --------    ------
	  cpu                850m (42%)  100m (5%)
	  memory             220Mi (2%)  220Mi (2%)
	  ephemeral-storage  0 (0%)      0 (0%)
	  hugepages-1Gi      0 (0%)      0 (0%)
	  hugepages-2Mi      0 (0%)      0 (0%)
	  hugepages-32Mi     0 (0%)      0 (0%)
	  hugepages-64Ki     0 (0%)      0 (0%)
	Events:
	  Type     Reason                   Age                    From             Message
	  ----     ------                   ----                   ----             -------
	  Normal   Starting                 6m37s                  kube-proxy       
	  Normal   NodeAllocatableEnforced  6m53s                  kubelet          Updated Node Allocatable limit across pods
	  Warning  CgroupV1                 6m53s                  kubelet          cgroup v1 support is in maintenance mode, please migrate to cgroup v2
	  Normal   NodeHasSufficientMemory  6m53s (x8 over 6m53s)  kubelet          Node addons-674149 status is now: NodeHasSufficientMemory
	  Normal   NodeHasNoDiskPressure    6m53s (x8 over 6m53s)  kubelet          Node addons-674149 status is now: NodeHasNoDiskPressure
	  Normal   NodeHasSufficientPID     6m53s (x7 over 6m53s)  kubelet          Node addons-674149 status is now: NodeHasSufficientPID
	  Normal   Starting                 6m53s                  kubelet          Starting kubelet.
	  Warning  CgroupV1                 6m45s                  kubelet          cgroup v1 support is in maintenance mode, please migrate to cgroup v2
	  Normal   Starting                 6m45s                  kubelet          Starting kubelet.
	  Normal   NodeAllocatableEnforced  6m45s                  kubelet          Updated Node Allocatable limit across pods
	  Normal   NodeHasSufficientMemory  6m45s                  kubelet          Node addons-674149 status is now: NodeHasSufficientMemory
	  Normal   NodeHasNoDiskPressure    6m45s                  kubelet          Node addons-674149 status is now: NodeHasNoDiskPressure
	  Normal   NodeHasSufficientPID     6m45s                  kubelet          Node addons-674149 status is now: NodeHasSufficientPID
	  Normal   RegisteredNode           6m41s                  node-controller  Node addons-674149 event: Registered Node addons-674149 in Controller
	  Normal   NodeReady                5m58s                  kubelet          Node addons-674149 status is now: NodeReady
	
	
	==> dmesg <==
	[Nov24 08:20] overlayfs: idmapped layers are currently not supported
	[Nov24 08:21] overlayfs: idmapped layers are currently not supported
	[Nov24 08:22] overlayfs: idmapped layers are currently not supported
	[Nov24 08:23] overlayfs: idmapped layers are currently not supported
	[Nov24 08:24] overlayfs: idmapped layers are currently not supported
	[ +19.566509] overlayfs: idmapped layers are currently not supported
	[Nov24 08:25] overlayfs: idmapped layers are currently not supported
	[ +35.934095] overlayfs: idmapped layers are currently not supported
	[Nov24 08:27] overlayfs: idmapped layers are currently not supported
	[Nov24 08:28] overlayfs: idmapped layers are currently not supported
	[Nov24 08:29] overlayfs: idmapped layers are currently not supported
	[Nov24 08:30] overlayfs: idmapped layers are currently not supported
	[Nov24 08:31] overlayfs: idmapped layers are currently not supported
	[ +44.505180] overlayfs: idmapped layers are currently not supported
	[Nov24 08:32] overlayfs: idmapped layers are currently not supported
	[Nov24 08:33] overlayfs: idmapped layers are currently not supported
	[Nov24 08:34] overlayfs: idmapped layers are currently not supported
	[ +21.137877] overlayfs: idmapped layers are currently not supported
	[ +28.724175] overlayfs: idmapped layers are currently not supported
	[Nov24 08:36] overlayfs: idmapped layers are currently not supported
	[Nov24 08:37] overlayfs: idmapped layers are currently not supported
	[Nov24 08:38] overlayfs: idmapped layers are currently not supported
	[  +4.063834] overlayfs: idmapped layers are currently not supported
	[Nov24 08:40] overlayfs: idmapped layers are currently not supported
	[Nov24 08:42] kauditd_printk_skb: 8 callbacks suppressed
	
	
	==> etcd [d7c8cc0aaba10152dbbb131910f889616c68f2ce2e26274f4a59a08a2632d9a7] <==
	{"level":"warn","ts":"2025-11-24T08:44:15.492548Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:55700","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:44:15.511510Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:55684","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:44:15.524180Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:55726","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:44:15.553096Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:55744","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:44:15.591178Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:55758","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:44:15.608570Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:55784","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:44:15.639836Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:55798","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:44:15.673051Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:55810","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:44:15.700310Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:55834","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:44:15.795163Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:55840","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:44:34.172528Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:39096","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:44:34.182016Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:39118","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:44:53.565347Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:33354","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:44:53.586572Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:33360","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:44:53.604016Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:33374","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:44:53.619684Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:33392","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:44:53.637354Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:33418","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:44:53.673275Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:33426","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:44:53.707136Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:33452","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:44:53.774600Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:33472","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:44:53.814977Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:33484","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:44:53.844610Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:33504","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:44:53.876486Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:33518","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:44:53.891221Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:33544","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:44:53.906498Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:33562","server-name":"","error":"EOF"}
	
	
	==> kernel <==
	 08:51:04 up  7:33,  0 user,  load average: 0.71, 1.21, 2.15
	Linux addons-674149 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kindnet [e4fa1d980373a22a53530397747bc64693a3bea7675ada03b1b8f0912049aebe] <==
	I1124 08:48:56.128548       1 main.go:301] handling current node
	I1124 08:49:06.123787       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1124 08:49:06.123824       1 main.go:301] handling current node
	I1124 08:49:16.120273       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1124 08:49:16.120371       1 main.go:301] handling current node
	I1124 08:49:26.128503       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1124 08:49:26.128611       1 main.go:301] handling current node
	I1124 08:49:36.119821       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1124 08:49:36.119916       1 main.go:301] handling current node
	I1124 08:49:46.120847       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1124 08:49:46.120884       1 main.go:301] handling current node
	I1124 08:49:56.119813       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1124 08:49:56.119853       1 main.go:301] handling current node
	I1124 08:50:06.120087       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1124 08:50:06.120127       1 main.go:301] handling current node
	I1124 08:50:16.125224       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1124 08:50:16.125262       1 main.go:301] handling current node
	I1124 08:50:26.120027       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1124 08:50:26.120167       1 main.go:301] handling current node
	I1124 08:50:36.127581       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1124 08:50:36.127620       1 main.go:301] handling current node
	I1124 08:50:46.119802       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1124 08:50:46.119834       1 main.go:301] handling current node
	I1124 08:50:56.127056       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1124 08:50:56.127093       1 main.go:301] handling current node
	
	
	==> kube-apiserver [3e8aa15421719b446531e1864354eaddee258990084731c567bc081f9a50edec] <==
	I1124 08:46:36.488206       1 handler.go:285] Adding GroupVersion flow.volcano.sh v1alpha1 to ResourceManager
	W1124 08:46:36.488734       1 cacher.go:182] Terminating all watchers from cacher podgroups.scheduling.volcano.sh
	W1124 08:46:36.514587       1 cacher.go:182] Terminating all watchers from cacher hypernodes.topology.volcano.sh
	W1124 08:46:36.535261       1 cacher.go:182] Terminating all watchers from cacher numatopologies.nodeinfo.volcano.sh
	W1124 08:46:36.573208       1 cacher.go:182] Terminating all watchers from cacher queues.scheduling.volcano.sh
	W1124 08:46:37.488306       1 cacher.go:182] Terminating all watchers from cacher jobtemplates.flow.volcano.sh
	W1124 08:46:37.538037       1 cacher.go:182] Terminating all watchers from cacher jobflows.flow.volcano.sh
	E1124 08:46:54.048779       1 conn.go:339] Error on socket receive: read tcp 192.168.49.2:8443->192.168.49.1:51314: use of closed network connection
	I1124 08:47:04.636114       1 alloc.go:328] "allocated clusterIPs" service="headlamp/headlamp" clusterIPs={"IPv4":"10.99.49.166"}
	I1124 08:47:31.495897       1 controller.go:667] quota admission added evaluator for: volumesnapshots.snapshot.storage.k8s.io
	I1124 08:47:31.514419       1 controller.go:129] OpenAPI AggregationController: action for item v1beta1.metrics.k8s.io: Nothing (removed from the queue).
	I1124 08:47:39.030387       1 controller.go:667] quota admission added evaluator for: ingresses.networking.k8s.io
	I1124 08:47:39.379245       1 alloc.go:328] "allocated clusterIPs" service="default/nginx" clusterIPs={"IPv4":"10.105.221.152"}
	I1124 08:47:47.941470       1 handler.go:285] Adding GroupVersion snapshot.storage.k8s.io v1 to ResourceManager
	I1124 08:47:47.941524       1 handler.go:285] Adding GroupVersion snapshot.storage.k8s.io v1beta1 to ResourceManager
	I1124 08:47:47.975146       1 handler.go:285] Adding GroupVersion snapshot.storage.k8s.io v1 to ResourceManager
	I1124 08:47:47.975197       1 handler.go:285] Adding GroupVersion snapshot.storage.k8s.io v1beta1 to ResourceManager
	I1124 08:47:48.026057       1 handler.go:285] Adding GroupVersion snapshot.storage.k8s.io v1 to ResourceManager
	I1124 08:47:48.026126       1 handler.go:285] Adding GroupVersion snapshot.storage.k8s.io v1beta1 to ResourceManager
	I1124 08:47:48.069495       1 handler.go:285] Adding GroupVersion snapshot.storage.k8s.io v1 to ResourceManager
	I1124 08:47:48.069551       1 handler.go:285] Adding GroupVersion snapshot.storage.k8s.io v1beta1 to ResourceManager
	I1124 08:47:48.356742       1 alloc.go:328] "allocated clusterIPs" service="default/hello-world-app" clusterIPs={"IPv4":"10.101.172.173"}
	W1124 08:47:49.031562       1 cacher.go:182] Terminating all watchers from cacher volumesnapshotclasses.snapshot.storage.k8s.io
	W1124 08:47:49.069700       1 cacher.go:182] Terminating all watchers from cacher volumesnapshotcontents.snapshot.storage.k8s.io
	W1124 08:47:49.134544       1 cacher.go:182] Terminating all watchers from cacher volumesnapshots.snapshot.storage.k8s.io
	
	
	==> kube-controller-manager [803b3ef862de5614b9cf06db105b21ca084eeefa3a8f155ad6cf27f24829f74f] <==
	E1124 08:50:22.361015       1 reflector.go:205] "Failed to watch" err="failed to list *v1.PartialObjectMetadata: the server could not find the requested resource" logger="UnhandledError" reflector="k8s.io/client-go/metadata/metadatainformer/informer.go:138" type="*v1.PartialObjectMetadata"
	E1124 08:50:28.051376       1 reflector.go:422] "The watchlist request ended with an error, falling back to the standard LIST/WATCH semantics because making progress is better than deadlocking" err="the server could not find the requested resource"
	E1124 08:50:28.052856       1 reflector.go:205] "Failed to watch" err="failed to list *v1.PartialObjectMetadata: the server could not find the requested resource" logger="UnhandledError" reflector="k8s.io/client-go/metadata/metadatainformer/informer.go:138" type="*v1.PartialObjectMetadata"
	E1124 08:50:30.803211       1 reflector.go:422] "The watchlist request ended with an error, falling back to the standard LIST/WATCH semantics because making progress is better than deadlocking" err="the server could not find the requested resource"
	E1124 08:50:30.804497       1 reflector.go:205] "Failed to watch" err="failed to list *v1.PartialObjectMetadata: the server could not find the requested resource" logger="UnhandledError" reflector="k8s.io/client-go/metadata/metadatainformer/informer.go:138" type="*v1.PartialObjectMetadata"
	E1124 08:50:31.798315       1 reflector.go:422] "The watchlist request ended with an error, falling back to the standard LIST/WATCH semantics because making progress is better than deadlocking" err="the server could not find the requested resource"
	E1124 08:50:31.799547       1 reflector.go:205] "Failed to watch" err="failed to list *v1.PartialObjectMetadata: the server could not find the requested resource" logger="UnhandledError" reflector="k8s.io/client-go/metadata/metadatainformer/informer.go:138" type="*v1.PartialObjectMetadata"
	E1124 08:50:33.486184       1 reflector.go:422] "The watchlist request ended with an error, falling back to the standard LIST/WATCH semantics because making progress is better than deadlocking" err="the server could not find the requested resource"
	E1124 08:50:33.487496       1 reflector.go:205] "Failed to watch" err="failed to list *v1.PartialObjectMetadata: the server could not find the requested resource" logger="UnhandledError" reflector="k8s.io/client-go/metadata/metadatainformer/informer.go:138" type="*v1.PartialObjectMetadata"
	E1124 08:50:44.616355       1 reflector.go:422] "The watchlist request ended with an error, falling back to the standard LIST/WATCH semantics because making progress is better than deadlocking" err="the server could not find the requested resource"
	E1124 08:50:44.617823       1 reflector.go:205] "Failed to watch" err="failed to list *v1.PartialObjectMetadata: the server could not find the requested resource" logger="UnhandledError" reflector="k8s.io/client-go/metadata/metadatainformer/informer.go:138" type="*v1.PartialObjectMetadata"
	E1124 08:50:50.838084       1 reflector.go:422] "The watchlist request ended with an error, falling back to the standard LIST/WATCH semantics because making progress is better than deadlocking" err="the server could not find the requested resource"
	E1124 08:50:50.839369       1 reflector.go:205] "Failed to watch" err="failed to list *v1.PartialObjectMetadata: the server could not find the requested resource" logger="UnhandledError" reflector="k8s.io/client-go/metadata/metadatainformer/informer.go:138" type="*v1.PartialObjectMetadata"
	E1124 08:50:52.825567       1 reflector.go:422] "The watchlist request ended with an error, falling back to the standard LIST/WATCH semantics because making progress is better than deadlocking" err="the server could not find the requested resource"
	E1124 08:50:52.826937       1 reflector.go:205] "Failed to watch" err="failed to list *v1.PartialObjectMetadata: the server could not find the requested resource" logger="UnhandledError" reflector="k8s.io/client-go/metadata/metadatainformer/informer.go:138" type="*v1.PartialObjectMetadata"
	E1124 08:50:56.044627       1 reflector.go:422] "The watchlist request ended with an error, falling back to the standard LIST/WATCH semantics because making progress is better than deadlocking" err="the server could not find the requested resource"
	E1124 08:50:56.045854       1 reflector.go:205] "Failed to watch" err="failed to list *v1.PartialObjectMetadata: the server could not find the requested resource" logger="UnhandledError" reflector="k8s.io/client-go/metadata/metadatainformer/informer.go:138" type="*v1.PartialObjectMetadata"
	E1124 08:50:56.170256       1 reflector.go:422] "The watchlist request ended with an error, falling back to the standard LIST/WATCH semantics because making progress is better than deadlocking" err="the server could not find the requested resource"
	E1124 08:50:56.171580       1 reflector.go:205] "Failed to watch" err="failed to list *v1.PartialObjectMetadata: the server could not find the requested resource" logger="UnhandledError" reflector="k8s.io/client-go/metadata/metadatainformer/informer.go:138" type="*v1.PartialObjectMetadata"
	E1124 08:50:57.344199       1 reflector.go:422] "The watchlist request ended with an error, falling back to the standard LIST/WATCH semantics because making progress is better than deadlocking" err="the server could not find the requested resource"
	E1124 08:50:57.345308       1 reflector.go:205] "Failed to watch" err="failed to list *v1.PartialObjectMetadata: the server could not find the requested resource" logger="UnhandledError" reflector="k8s.io/client-go/metadata/metadatainformer/informer.go:138" type="*v1.PartialObjectMetadata"
	E1124 08:50:57.768080       1 reflector.go:422] "The watchlist request ended with an error, falling back to the standard LIST/WATCH semantics because making progress is better than deadlocking" err="the server could not find the requested resource"
	E1124 08:50:57.769090       1 reflector.go:205] "Failed to watch" err="failed to list *v1.PartialObjectMetadata: the server could not find the requested resource" logger="UnhandledError" reflector="k8s.io/client-go/metadata/metadatainformer/informer.go:138" type="*v1.PartialObjectMetadata"
	E1124 08:51:02.293400       1 reflector.go:422] "The watchlist request ended with an error, falling back to the standard LIST/WATCH semantics because making progress is better than deadlocking" err="the server could not find the requested resource"
	E1124 08:51:02.294695       1 reflector.go:205] "Failed to watch" err="failed to list *v1.PartialObjectMetadata: the server could not find the requested resource" logger="UnhandledError" reflector="k8s.io/client-go/metadata/metadatainformer/informer.go:138" type="*v1.PartialObjectMetadata"
	
	
	==> kube-proxy [b7dc458fbc9f4310377a3cffeff8089be0942289ffb29471015f3b29b1b15a76] <==
	I1124 08:44:25.982723       1 server_linux.go:53] "Using iptables proxy"
	I1124 08:44:26.060307       1 shared_informer.go:349] "Waiting for caches to sync" controller="node informer cache"
	I1124 08:44:26.160756       1 shared_informer.go:356] "Caches are synced" controller="node informer cache"
	I1124 08:44:26.160801       1 server.go:219] "Successfully retrieved NodeIPs" NodeIPs=["192.168.49.2"]
	E1124 08:44:26.160889       1 server.go:256] "Kube-proxy configuration may be incomplete or incorrect" err="nodePortAddresses is unset; NodePort connections will be accepted on all local IPs. Consider using `--nodeport-addresses primary`"
	I1124 08:44:26.186599       1 server.go:265] "kube-proxy running in dual-stack mode" primary ipFamily="IPv4"
	I1124 08:44:26.186650       1 server_linux.go:132] "Using iptables Proxier"
	I1124 08:44:26.190432       1 proxier.go:242] "Setting route_localnet=1 to allow node-ports on localhost; to change this either disable iptables.localhostNodePorts (--iptables-localhost-nodeports) or set nodePortAddresses (--nodeport-addresses) to filter loopback addresses" ipFamily="IPv4"
	I1124 08:44:26.190850       1 server.go:527] "Version info" version="v1.34.2"
	I1124 08:44:26.190871       1 server.go:529] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I1124 08:44:26.192165       1 config.go:200] "Starting service config controller"
	I1124 08:44:26.192189       1 shared_informer.go:349] "Waiting for caches to sync" controller="service config"
	I1124 08:44:26.192207       1 config.go:106] "Starting endpoint slice config controller"
	I1124 08:44:26.192211       1 shared_informer.go:349] "Waiting for caches to sync" controller="endpoint slice config"
	I1124 08:44:26.192222       1 config.go:403] "Starting serviceCIDR config controller"
	I1124 08:44:26.192231       1 shared_informer.go:349] "Waiting for caches to sync" controller="serviceCIDR config"
	I1124 08:44:26.195804       1 config.go:309] "Starting node config controller"
	I1124 08:44:26.195830       1 shared_informer.go:349] "Waiting for caches to sync" controller="node config"
	I1124 08:44:26.195839       1 shared_informer.go:356] "Caches are synced" controller="node config"
	I1124 08:44:26.293000       1 shared_informer.go:356] "Caches are synced" controller="serviceCIDR config"
	I1124 08:44:26.293038       1 shared_informer.go:356] "Caches are synced" controller="service config"
	I1124 08:44:26.293077       1 shared_informer.go:356] "Caches are synced" controller="endpoint slice config"
	
	
	==> kube-scheduler [2a12d84eb6575a659f4580014afc906d613030c7af1d3e2e6670c224557216c6] <==
	E1124 08:44:16.542890       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: nodes is forbidden: User \"system:kube-scheduler\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	E1124 08:44:16.543115       1 reflector.go:205] "Failed to watch" err="failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User \"system:kube-scheduler\" cannot list resource \"persistentvolumeclaims\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.PersistentVolumeClaim"
	E1124 08:44:16.543183       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:kube-scheduler\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
	E1124 08:44:16.543380       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User \"system:kube-scheduler\" cannot list resource \"replicasets\" in API group \"apps\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ReplicaSet"
	E1124 08:44:16.543735       1 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIStorageCapacity: csistoragecapacities.storage.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"csistoragecapacities\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIStorageCapacity"
	E1124 08:44:16.543886       1 reflector.go:205] "Failed to watch" err="failed to list *v1.VolumeAttachment: volumeattachments.storage.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"volumeattachments\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.VolumeAttachment"
	E1124 08:44:16.544185       1 reflector.go:205] "Failed to watch" err="failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"storageclasses\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.StorageClass"
	E1124 08:44:16.544504       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ResourceSlice: resourceslices.resource.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"resourceslices\" in API group \"resource.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ResourceSlice"
	E1124 08:44:16.544544       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Namespace: namespaces is forbidden: User \"system:kube-scheduler\" cannot list resource \"namespaces\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Namespace"
	E1124 08:44:17.352984       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ResourceClaim: resourceclaims.resource.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"resourceclaims\" in API group \"resource.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ResourceClaim"
	E1124 08:44:17.465655       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User \"system:kube-scheduler\" cannot list resource \"replicasets\" in API group \"apps\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ReplicaSet"
	E1124 08:44:17.469155       1 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
	E1124 08:44:17.542891       1 reflector.go:205] "Failed to watch" err="failed to list *v1.PersistentVolume: persistentvolumes is forbidden: User \"system:kube-scheduler\" cannot list resource \"persistentvolumes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.PersistentVolume"
	E1124 08:44:17.661778       1 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIStorageCapacity: csistoragecapacities.storage.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"csistoragecapacities\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIStorageCapacity"
	E1124 08:44:17.712714       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ReplicationController: replicationcontrollers is forbidden: User \"system:kube-scheduler\" cannot list resource \"replicationcontrollers\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ReplicationController"
	E1124 08:44:17.787981       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ResourceSlice: resourceslices.resource.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"resourceslices\" in API group \"resource.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ResourceSlice"
	E1124 08:44:17.798268       1 reflector.go:205] "Failed to watch" err="failed to list *v1.VolumeAttachment: volumeattachments.storage.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"volumeattachments\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.VolumeAttachment"
	E1124 08:44:17.810208       1 reflector.go:205] "Failed to watch" err="failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User \"system:kube-scheduler\" cannot list resource \"persistentvolumeclaims\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.PersistentVolumeClaim"
	E1124 08:44:17.815176       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Pod: pods is forbidden: User \"system:kube-scheduler\" cannot list resource \"pods\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Pod"
	E1124 08:44:17.822626       1 reflector.go:205] "Failed to watch" err="failed to list *v1.DeviceClass: deviceclasses.resource.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"deviceclasses\" in API group \"resource.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.DeviceClass"
	E1124 08:44:17.833935       1 reflector.go:205] "Failed to watch" err="failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"csinodes\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSINode"
	E1124 08:44:17.844872       1 reflector.go:205] "Failed to watch" err="failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"storageclasses\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.StorageClass"
	E1124 08:44:17.845759       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Namespace: namespaces is forbidden: User \"system:kube-scheduler\" cannot list resource \"namespaces\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Namespace"
	E1124 08:44:18.138845       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"extension-apiserver-authentication\" is forbidden: User \"system:kube-scheduler\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"kube-system\"" logger="UnhandledError" reflector="runtime/asm_arm64.s:1223" type="*v1.ConfigMap"
	I1124 08:44:20.624871       1 shared_informer.go:356] "Caches are synced" controller="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
	
	
	==> kubelet <==
	Nov 24 08:49:27 addons-674149 kubelet[1447]: E1124 08:49:27.643619    1447 kuberuntime_manager.go:1449] "Unhandled Error" err=<
	Nov 24 08:49:27 addons-674149 kubelet[1447]:         container busybox start failed in pod test-local-path_default(db57d505-80a3-4fb4-b2fe-b9a4b8812a33): ErrImagePull: failed to pull and unpack image "docker.io/library/busybox:stable": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/library/busybox/manifests/sha256:355b3a1bf5609da364166913878a8508d4ba30572d02020a97028c75477e24ff: 429 Too Many Requests
	Nov 24 08:49:27 addons-674149 kubelet[1447]:         toomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit
	Nov 24 08:49:27 addons-674149 kubelet[1447]:  > logger="UnhandledError"
	Nov 24 08:49:27 addons-674149 kubelet[1447]: E1124 08:49:27.643657    1447 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"busybox\" with ErrImagePull: \"failed to pull and unpack image \\\"docker.io/library/busybox:stable\\\": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/library/busybox/manifests/sha256:355b3a1bf5609da364166913878a8508d4ba30572d02020a97028c75477e24ff: 429 Too Many Requests\\ntoomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit\"" pod="default/test-local-path" podUID="db57d505-80a3-4fb4-b2fe-b9a4b8812a33"
	Nov 24 08:49:40 addons-674149 kubelet[1447]: E1124 08:49:40.218028    1447 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"busybox\" with ImagePullBackOff: \"Back-off pulling image \\\"busybox:stable\\\": ErrImagePull: failed to pull and unpack image \\\"docker.io/library/busybox:stable\\\": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/library/busybox/manifests/sha256:355b3a1bf5609da364166913878a8508d4ba30572d02020a97028c75477e24ff: 429 Too Many Requests\\ntoomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit\"" pod="default/test-local-path" podUID="db57d505-80a3-4fb4-b2fe-b9a4b8812a33"
	Nov 24 08:49:54 addons-674149 kubelet[1447]: E1124 08:49:54.217352    1447 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"busybox\" with ImagePullBackOff: \"Back-off pulling image \\\"busybox:stable\\\": ErrImagePull: failed to pull and unpack image \\\"docker.io/library/busybox:stable\\\": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/library/busybox/manifests/sha256:355b3a1bf5609da364166913878a8508d4ba30572d02020a97028c75477e24ff: 429 Too Many Requests\\ntoomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit\"" pod="default/test-local-path" podUID="db57d505-80a3-4fb4-b2fe-b9a4b8812a33"
	Nov 24 08:50:09 addons-674149 kubelet[1447]: E1124 08:50:09.217921    1447 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"busybox\" with ImagePullBackOff: \"Back-off pulling image \\\"busybox:stable\\\": ErrImagePull: failed to pull and unpack image \\\"docker.io/library/busybox:stable\\\": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/library/busybox/manifests/sha256:355b3a1bf5609da364166913878a8508d4ba30572d02020a97028c75477e24ff: 429 Too Many Requests\\ntoomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit\"" pod="default/test-local-path" podUID="db57d505-80a3-4fb4-b2fe-b9a4b8812a33"
	Nov 24 08:50:22 addons-674149 kubelet[1447]: E1124 08:50:22.217446    1447 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"busybox\" with ImagePullBackOff: \"Back-off pulling image \\\"busybox:stable\\\": ErrImagePull: failed to pull and unpack image \\\"docker.io/library/busybox:stable\\\": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/library/busybox/manifests/sha256:355b3a1bf5609da364166913878a8508d4ba30572d02020a97028c75477e24ff: 429 Too Many Requests\\ntoomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit\"" pod="default/test-local-path" podUID="db57d505-80a3-4fb4-b2fe-b9a4b8812a33"
	Nov 24 08:50:34 addons-674149 kubelet[1447]: E1124 08:50:34.218519    1447 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"busybox\" with ImagePullBackOff: \"Back-off pulling image \\\"busybox:stable\\\": ErrImagePull: failed to pull and unpack image \\\"docker.io/library/busybox:stable\\\": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/library/busybox/manifests/sha256:355b3a1bf5609da364166913878a8508d4ba30572d02020a97028c75477e24ff: 429 Too Many Requests\\ntoomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit\"" pod="default/test-local-path" podUID="db57d505-80a3-4fb4-b2fe-b9a4b8812a33"
	Nov 24 08:50:38 addons-674149 kubelet[1447]: I1124 08:50:38.216575    1447 kubelet_pods.go:1082] "Unable to retrieve pull secret, the image pull may not succeed." pod="default/busybox" secret="" err="secret \"gcp-auth\" not found"
	Nov 24 08:50:45 addons-674149 kubelet[1447]: E1124 08:50:45.217904    1447 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"busybox\" with ImagePullBackOff: \"Back-off pulling image \\\"busybox:stable\\\": ErrImagePull: failed to pull and unpack image \\\"docker.io/library/busybox:stable\\\": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/library/busybox/manifests/sha256:355b3a1bf5609da364166913878a8508d4ba30572d02020a97028c75477e24ff: 429 Too Many Requests\\ntoomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit\"" pod="default/test-local-path" podUID="db57d505-80a3-4fb4-b2fe-b9a4b8812a33"
	Nov 24 08:50:57 addons-674149 kubelet[1447]: E1124 08:50:57.757106    1447 log.go:32] "PullImage from image service failed" err=<
	Nov 24 08:50:57 addons-674149 kubelet[1447]:         rpc error: code = Unknown desc = failed to pull and unpack image "docker.io/library/busybox:stable": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/library/busybox/manifests/sha256:079b4a73854a059a2073c6e1a031b17fcbf23a47c6c59ae760d78045199e403c: 429 Too Many Requests
	Nov 24 08:50:57 addons-674149 kubelet[1447]:         toomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit
	Nov 24 08:50:57 addons-674149 kubelet[1447]:  > image="busybox:stable"
	Nov 24 08:50:57 addons-674149 kubelet[1447]: E1124 08:50:57.757161    1447 kuberuntime_image.go:43] "Failed to pull image" err=<
	Nov 24 08:50:57 addons-674149 kubelet[1447]:         failed to pull and unpack image "docker.io/library/busybox:stable": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/library/busybox/manifests/sha256:079b4a73854a059a2073c6e1a031b17fcbf23a47c6c59ae760d78045199e403c: 429 Too Many Requests
	Nov 24 08:50:57 addons-674149 kubelet[1447]:         toomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit
	Nov 24 08:50:57 addons-674149 kubelet[1447]:  > image="busybox:stable"
	Nov 24 08:50:57 addons-674149 kubelet[1447]: E1124 08:50:57.757234    1447 kuberuntime_manager.go:1449] "Unhandled Error" err=<
	Nov 24 08:50:57 addons-674149 kubelet[1447]:         container busybox start failed in pod test-local-path_default(db57d505-80a3-4fb4-b2fe-b9a4b8812a33): ErrImagePull: failed to pull and unpack image "docker.io/library/busybox:stable": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/library/busybox/manifests/sha256:079b4a73854a059a2073c6e1a031b17fcbf23a47c6c59ae760d78045199e403c: 429 Too Many Requests
	Nov 24 08:50:57 addons-674149 kubelet[1447]:         toomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit
	Nov 24 08:50:57 addons-674149 kubelet[1447]:  > logger="UnhandledError"
	Nov 24 08:50:57 addons-674149 kubelet[1447]: E1124 08:50:57.757269    1447 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"busybox\" with ErrImagePull: \"failed to pull and unpack image \\\"docker.io/library/busybox:stable\\\": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/library/busybox/manifests/sha256:079b4a73854a059a2073c6e1a031b17fcbf23a47c6c59ae760d78045199e403c: 429 Too Many Requests\\ntoomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit\"" pod="default/test-local-path" podUID="db57d505-80a3-4fb4-b2fe-b9a4b8812a33"
	
	
	==> storage-provisioner [03141ec1a42aa9a30c0d9ae9f482a1408582df885ea312c1cd0aea9d75185ad3] <==
	W1124 08:50:38.463859       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 08:50:40.467714       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 08:50:40.474339       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 08:50:42.477392       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 08:50:42.482336       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 08:50:44.485855       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 08:50:44.490300       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 08:50:46.493790       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 08:50:46.499271       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 08:50:48.502315       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 08:50:48.507238       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 08:50:50.511536       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 08:50:50.516737       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 08:50:52.519962       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 08:50:52.527144       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 08:50:54.530526       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 08:50:54.535578       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 08:50:56.538820       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 08:50:56.543635       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 08:50:58.547019       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 08:50:58.552127       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 08:51:00.556177       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 08:51:00.561273       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 08:51:02.565308       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 08:51:02.573399       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	

-- /stdout --
helpers_test.go:262: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p addons-674149 -n addons-674149
helpers_test.go:269: (dbg) Run:  kubectl --context addons-674149 get po -o=jsonpath={.items[*].metadata.name} -A --field-selector=status.phase!=Running
helpers_test.go:280: non-running pods: test-local-path
helpers_test.go:282: ======> post-mortem[TestAddons/parallel/LocalPath]: describe non-running pods <======
helpers_test.go:285: (dbg) Run:  kubectl --context addons-674149 describe pod test-local-path
helpers_test.go:290: (dbg) kubectl --context addons-674149 describe pod test-local-path:

-- stdout --
	Name:             test-local-path
	Namespace:        default
	Priority:         0
	Service Account:  default
	Node:             addons-674149/192.168.49.2
	Start Time:       Mon, 24 Nov 2025 08:48:02 +0000
	Labels:           run=test-local-path
	Annotations:      <none>
	Status:           Pending
	IP:               10.244.0.35
	IPs:
	  IP:  10.244.0.35
	Containers:
	  busybox:
	    Container ID:  
	    Image:         busybox:stable
	    Image ID:      
	    Port:          <none>
	    Host Port:     <none>
	    Command:
	      sh
	      -c
	      echo 'local-path-provisioner' > /test/file1
	    State:          Waiting
	      Reason:       ImagePullBackOff
	    Ready:          False
	    Restart Count:  0
	    Environment:    <none>
	    Mounts:
	      /test from data (rw)
	      /var/run/secrets/kubernetes.io/serviceaccount from kube-api-access-l7fl2 (ro)
	Conditions:
	  Type                        Status
	  PodReadyToStartContainers   True 
	  Initialized                 True 
	  Ready                       False 
	  ContainersReady             False 
	  PodScheduled                True 
	Volumes:
	  data:
	    Type:       PersistentVolumeClaim (a reference to a PersistentVolumeClaim in the same namespace)
	    ClaimName:  test-pvc
	    ReadOnly:   false
	  kube-api-access-l7fl2:
	    Type:                    Projected (a volume that contains injected data from multiple sources)
	    TokenExpirationSeconds:  3607
	    ConfigMapName:           kube-root-ca.crt
	    Optional:                false
	    DownwardAPI:             true
	QoS Class:                   BestEffort
	Node-Selectors:              <none>
	Tolerations:                 node.kubernetes.io/not-ready:NoExecute op=Exists for 300s
	                             node.kubernetes.io/unreachable:NoExecute op=Exists for 300s
	Events:
	  Type     Reason     Age                 From               Message
	  ----     ------     ----                ----               -------
	  Normal   Scheduled  3m3s                default-scheduler  Successfully assigned default/test-local-path to addons-674149
	  Warning  Failed     98s (x4 over 3m2s)  kubelet            Failed to pull image "busybox:stable": failed to pull and unpack image "docker.io/library/busybox:stable": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/library/busybox/manifests/sha256:355b3a1bf5609da364166913878a8508d4ba30572d02020a97028c75477e24ff: 429 Too Many Requests
	toomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit
	  Normal   BackOff  20s (x10 over 3m2s)  kubelet  Back-off pulling image "busybox:stable"
	  Warning  Failed   20s (x10 over 3m2s)  kubelet  Error: ImagePullBackOff
	  Normal   Pulling  8s (x5 over 3m3s)    kubelet  Pulling image "busybox:stable"
	  Warning  Failed   8s (x5 over 3m2s)    kubelet  Error: ErrImagePull
	  Warning  Failed   8s                   kubelet  Failed to pull image "busybox:stable": failed to pull and unpack image "docker.io/library/busybox:stable": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/library/busybox/manifests/sha256:079b4a73854a059a2073c6e1a031b17fcbf23a47c6c59ae760d78045199e403c: 429 Too Many Requests
	toomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit

-- /stdout --
helpers_test.go:293: <<< TestAddons/parallel/LocalPath FAILED: end of post-mortem logs <<<
helpers_test.go:294: ---------------------/post-mortem---------------------------------
addons_test.go:1053: (dbg) Run:  out/minikube-linux-arm64 -p addons-674149 addons disable storage-provisioner-rancher --alsologtostderr -v=1
addons_test.go:1053: (dbg) Done: out/minikube-linux-arm64 -p addons-674149 addons disable storage-provisioner-rancher --alsologtostderr -v=1: (42.8380805s)
--- FAIL: TestAddons/parallel/LocalPath (231.34s)

TestDockerEnvContainerd (51.16s)

=== RUN   TestDockerEnvContainerd
docker_test.go:170: running with containerd true linux arm64
docker_test.go:181: (dbg) Run:  out/minikube-linux-arm64 start -p dockerenv-050504 --driver=docker  --container-runtime=containerd
docker_test.go:181: (dbg) Done: out/minikube-linux-arm64 start -p dockerenv-050504 --driver=docker  --container-runtime=containerd: (32.813763366s)
docker_test.go:189: (dbg) Run:  /bin/bash -c "out/minikube-linux-arm64 docker-env --ssh-host --ssh-add -p dockerenv-050504"
docker_test.go:189: (dbg) Done: /bin/bash -c "out/minikube-linux-arm64 docker-env --ssh-host --ssh-add -p dockerenv-050504": (1.082852738s)
docker_test.go:220: (dbg) Run:  /bin/bash -c "SSH_AUTH_SOCK="/tmp/ssh-L6a1lBmDlWwv/agent.1674021" SSH_AGENT_PID="1674022" DOCKER_HOST=ssh://docker@127.0.0.1:34669 docker version"
docker_test.go:243: (dbg) Run:  /bin/bash -c "SSH_AUTH_SOCK="/tmp/ssh-L6a1lBmDlWwv/agent.1674021" SSH_AGENT_PID="1674022" DOCKER_HOST=ssh://docker@127.0.0.1:34669 DOCKER_BUILDKIT=0 docker build -t local/minikube-dockerenv-containerd-test:latest testdata/docker-env"
docker_test.go:243: (dbg) Non-zero exit: /bin/bash -c "SSH_AUTH_SOCK="/tmp/ssh-L6a1lBmDlWwv/agent.1674021" SSH_AGENT_PID="1674022" DOCKER_HOST=ssh://docker@127.0.0.1:34669 DOCKER_BUILDKIT=0 docker build -t local/minikube-dockerenv-containerd-test:latest testdata/docker-env": exit status 1 (949.656411ms)

-- stdout --
	Sending build context to Docker daemon  2.048kB

-- /stdout --
** stderr ** 
	DEPRECATED: The legacy builder is deprecated and will be removed in a future release.
	            BuildKit is currently disabled; enable it by removing the DOCKER_BUILDKIT=0
	            environment-variable.
	
	Error response from daemon: exit status 1

** /stderr **
docker_test.go:245: failed to build images, error: exit status 1, output:
-- stdout --
	Sending build context to Docker daemon  2.048kB

-- /stdout --
** stderr ** 
	DEPRECATED: The legacy builder is deprecated and will be removed in a future release.
	            BuildKit is currently disabled; enable it by removing the DOCKER_BUILDKIT=0
	            environment-variable.
	
	Error response from daemon: exit status 1

** /stderr **
docker_test.go:250: (dbg) Run:  /bin/bash -c "SSH_AUTH_SOCK="/tmp/ssh-L6a1lBmDlWwv/agent.1674021" SSH_AGENT_PID="1674022" DOCKER_HOST=ssh://docker@127.0.0.1:34669 docker image ls"
docker_test.go:255: failed to detect image 'local/minikube-dockerenv-containerd-test' in output of docker image ls
panic.go:615: *** TestDockerEnvContainerd FAILED at 2025-11-24 08:52:53.252947836 +0000 UTC m=+588.620254382
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:223: ======>  post-mortem[TestDockerEnvContainerd]: network settings <======
helpers_test.go:230: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:238: ======>  post-mortem[TestDockerEnvContainerd]: docker inspect <======
helpers_test.go:239: (dbg) Run:  docker inspect dockerenv-050504
helpers_test.go:243: (dbg) docker inspect dockerenv-050504:

-- stdout --
	[
	    {
	        "Id": "1c615061c075e4fcc55bacc95238cc709a4213d124cd9f7e3f9607a1368e4613",
	        "Created": "2025-11-24T08:52:11.913059467Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 1671714,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-11-24T08:52:11.982164552Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:572c983e466f1f784136812eef5cc59ac623db764bc7704d3676c4643993fd08",
	        "ResolvConfPath": "/var/lib/docker/containers/1c615061c075e4fcc55bacc95238cc709a4213d124cd9f7e3f9607a1368e4613/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/1c615061c075e4fcc55bacc95238cc709a4213d124cd9f7e3f9607a1368e4613/hostname",
	        "HostsPath": "/var/lib/docker/containers/1c615061c075e4fcc55bacc95238cc709a4213d124cd9f7e3f9607a1368e4613/hosts",
	        "LogPath": "/var/lib/docker/containers/1c615061c075e4fcc55bacc95238cc709a4213d124cd9f7e3f9607a1368e4613/1c615061c075e4fcc55bacc95238cc709a4213d124cd9f7e3f9607a1368e4613-json.log",
	        "Name": "/dockerenv-050504",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "/lib/modules:/lib/modules:ro",
	                "dockerenv-050504:/var"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "dockerenv-050504",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 3221225472,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 6442450944,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "1c615061c075e4fcc55bacc95238cc709a4213d124cd9f7e3f9607a1368e4613",
	                "LowerDir": "/var/lib/docker/overlay2/ffd6807451aed5ac8834b634f9ab6b40af762a75e98529ffa933f915742fbd76-init/diff:/var/lib/docker/overlay2/bf294786c7ade59cb761c7bdcc27045362c3779b00a569b685443f34ade17737/diff",
	                "MergedDir": "/var/lib/docker/overlay2/ffd6807451aed5ac8834b634f9ab6b40af762a75e98529ffa933f915742fbd76/merged",
	                "UpperDir": "/var/lib/docker/overlay2/ffd6807451aed5ac8834b634f9ab6b40af762a75e98529ffa933f915742fbd76/diff",
	                "WorkDir": "/var/lib/docker/overlay2/ffd6807451aed5ac8834b634f9ab6b40af762a75e98529ffa933f915742fbd76/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            },
	            {
	                "Type": "volume",
	                "Name": "dockerenv-050504",
	                "Source": "/var/lib/docker/volumes/dockerenv-050504/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            }
	        ],
	        "Config": {
	            "Hostname": "dockerenv-050504",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8443/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "dockerenv-050504",
	                "name.minikube.sigs.k8s.io": "dockerenv-050504",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "08925f2d03eb80e117f22952b7a13bb90d188ab9bb2ed4cb8eb9b8611e549e5b",
	            "SandboxKey": "/var/run/docker/netns/08925f2d03eb",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34669"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34670"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34673"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34671"
	                    }
	                ],
	                "8443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34672"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "dockerenv-050504": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "02:58:13:ca:5b:bf",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "0481e16e10c856161dfcf87086cb425a2f21d8246733564adcd8b96922065aea",
	                    "EndpointID": "cb3114d154401a714cb59f356a341d3dc8dfcbf9b7f2f3a75e83ebb6b22955fd",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "dockerenv-050504",
	                        "1c615061c075"
	                    ]
	                }
	            }
	        }
	    }
	]

-- /stdout --
helpers_test.go:247: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p dockerenv-050504 -n dockerenv-050504
helpers_test.go:252: <<< TestDockerEnvContainerd FAILED: start of post-mortem logs <<<
helpers_test.go:253: ======>  post-mortem[TestDockerEnvContainerd]: minikube logs <======
helpers_test.go:255: (dbg) Run:  out/minikube-linux-arm64 -p dockerenv-050504 logs -n 25
helpers_test.go:255: (dbg) Done: out/minikube-linux-arm64 -p dockerenv-050504 logs -n 25: (1.365217736s)
helpers_test.go:260: TestDockerEnvContainerd logs: 
-- stdout --
	
	==> Audit <==
	┌────────────┬─────────────────────────────────────────────────────────────────────────────────┬──────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│  COMMAND   │                                      ARGS                                       │     PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├────────────┼─────────────────────────────────────────────────────────────────────────────────┼──────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ addons     │ enable headlamp -p addons-674149 --alsologtostderr -v=1                         │ addons-674149    │ jenkins │ v1.37.0 │ 24 Nov 25 08:47 UTC │ 24 Nov 25 08:47 UTC │
	│ addons     │ addons-674149 addons disable headlamp --alsologtostderr -v=1                    │ addons-674149    │ jenkins │ v1.37.0 │ 24 Nov 25 08:47 UTC │ 24 Nov 25 08:47 UTC │
	│ ip         │ addons-674149 ip                                                                │ addons-674149    │ jenkins │ v1.37.0 │ 24 Nov 25 08:47 UTC │ 24 Nov 25 08:47 UTC │
	│ addons     │ addons-674149 addons disable registry --alsologtostderr -v=1                    │ addons-674149    │ jenkins │ v1.37.0 │ 24 Nov 25 08:47 UTC │ 24 Nov 25 08:47 UTC │
	│ addons     │ addons-674149 addons disable metrics-server --alsologtostderr -v=1              │ addons-674149    │ jenkins │ v1.37.0 │ 24 Nov 25 08:47 UTC │ 24 Nov 25 08:47 UTC │
	│ addons     │ addons-674149 addons disable inspektor-gadget --alsologtostderr -v=1            │ addons-674149    │ jenkins │ v1.37.0 │ 24 Nov 25 08:47 UTC │ 24 Nov 25 08:47 UTC │
	│ addons     │ addons-674149 addons disable volumesnapshots --alsologtostderr -v=1             │ addons-674149    │ jenkins │ v1.37.0 │ 24 Nov 25 08:47 UTC │ 24 Nov 25 08:47 UTC │
	│ ssh        │ addons-674149 ssh curl -s http://127.0.0.1/ -H 'Host: nginx.example.com'        │ addons-674149    │ jenkins │ v1.37.0 │ 24 Nov 25 08:47 UTC │ 24 Nov 25 08:47 UTC │
	│ ip         │ addons-674149 ip                                                                │ addons-674149    │ jenkins │ v1.37.0 │ 24 Nov 25 08:47 UTC │ 24 Nov 25 08:47 UTC │
	│ addons     │ addons-674149 addons disable csi-hostpath-driver --alsologtostderr -v=1         │ addons-674149    │ jenkins │ v1.37.0 │ 24 Nov 25 08:47 UTC │ 24 Nov 25 08:47 UTC │
	│ addons     │ addons-674149 addons disable ingress-dns --alsologtostderr -v=1                 │ addons-674149    │ jenkins │ v1.37.0 │ 24 Nov 25 08:47 UTC │ 24 Nov 25 08:47 UTC │
	│ addons     │ addons-674149 addons disable ingress --alsologtostderr -v=1                     │ addons-674149    │ jenkins │ v1.37.0 │ 24 Nov 25 08:47 UTC │ 24 Nov 25 08:47 UTC │
	│ addons     │ configure registry-creds -f ./testdata/addons_testconfig.json -p addons-674149  │ addons-674149    │ jenkins │ v1.37.0 │ 24 Nov 25 08:47 UTC │ 24 Nov 25 08:47 UTC │
	│ addons     │ addons-674149 addons disable registry-creds --alsologtostderr -v=1              │ addons-674149    │ jenkins │ v1.37.0 │ 24 Nov 25 08:47 UTC │ 24 Nov 25 08:47 UTC │
	│ addons     │ addons-674149 addons disable yakd --alsologtostderr -v=1                        │ addons-674149    │ jenkins │ v1.37.0 │ 24 Nov 25 08:48 UTC │ 24 Nov 25 08:48 UTC │
	│ addons     │ addons-674149 addons disable nvidia-device-plugin --alsologtostderr -v=1        │ addons-674149    │ jenkins │ v1.37.0 │ 24 Nov 25 08:48 UTC │ 24 Nov 25 08:48 UTC │
	│ addons     │ addons-674149 addons disable cloud-spanner --alsologtostderr -v=1               │ addons-674149    │ jenkins │ v1.37.0 │ 24 Nov 25 08:48 UTC │ 24 Nov 25 08:48 UTC │
	│ addons     │ addons-674149 addons disable storage-provisioner-rancher --alsologtostderr -v=1 │ addons-674149    │ jenkins │ v1.37.0 │ 24 Nov 25 08:51 UTC │ 24 Nov 25 08:51 UTC │
	│ stop       │ -p addons-674149                                                                │ addons-674149    │ jenkins │ v1.37.0 │ 24 Nov 25 08:51 UTC │ 24 Nov 25 08:52 UTC │
	│ addons     │ enable dashboard -p addons-674149                                               │ addons-674149    │ jenkins │ v1.37.0 │ 24 Nov 25 08:52 UTC │ 24 Nov 25 08:52 UTC │
	│ addons     │ disable dashboard -p addons-674149                                              │ addons-674149    │ jenkins │ v1.37.0 │ 24 Nov 25 08:52 UTC │ 24 Nov 25 08:52 UTC │
	│ addons     │ disable gvisor -p addons-674149                                                 │ addons-674149    │ jenkins │ v1.37.0 │ 24 Nov 25 08:52 UTC │ 24 Nov 25 08:52 UTC │
	│ delete     │ -p addons-674149                                                                │ addons-674149    │ jenkins │ v1.37.0 │ 24 Nov 25 08:52 UTC │ 24 Nov 25 08:52 UTC │
	│ start      │ -p dockerenv-050504 --driver=docker  --container-runtime=containerd             │ dockerenv-050504 │ jenkins │ v1.37.0 │ 24 Nov 25 08:52 UTC │ 24 Nov 25 08:52 UTC │
	│ docker-env │ --ssh-host --ssh-add -p dockerenv-050504                                        │ dockerenv-050504 │ jenkins │ v1.37.0 │ 24 Nov 25 08:52 UTC │ 24 Nov 25 08:52 UTC │
	└────────────┴─────────────────────────────────────────────────────────────────────────────────┴──────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/11/24 08:52:06
	Running on machine: ip-172-31-30-239
	Binary: Built with gc go1.25.3 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1124 08:52:06.908434 1671319 out.go:360] Setting OutFile to fd 1 ...
	I1124 08:52:06.908542 1671319 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1124 08:52:06.908545 1671319 out.go:374] Setting ErrFile to fd 2...
	I1124 08:52:06.908549 1671319 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1124 08:52:06.908813 1671319 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21978-1652607/.minikube/bin
	I1124 08:52:06.909187 1671319 out.go:368] Setting JSON to false
	I1124 08:52:06.909986 1671319 start.go:133] hostinfo: {"hostname":"ip-172-31-30-239","uptime":27256,"bootTime":1763947071,"procs":148,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"92f46a7d-c249-4c12-924a-77f64874c910"}
	I1124 08:52:06.910042 1671319 start.go:143] virtualization:  
	I1124 08:52:06.912184 1671319 out.go:179] * [dockerenv-050504] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1124 08:52:06.913411 1671319 out.go:179]   - MINIKUBE_LOCATION=21978
	I1124 08:52:06.913551 1671319 notify.go:221] Checking for updates...
	I1124 08:52:06.915928 1671319 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1124 08:52:06.917172 1671319 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/21978-1652607/kubeconfig
	I1124 08:52:06.918348 1671319 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/21978-1652607/.minikube
	I1124 08:52:06.919422 1671319 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1124 08:52:06.920439 1671319 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1124 08:52:06.921793 1671319 driver.go:422] Setting default libvirt URI to qemu:///system
	I1124 08:52:06.943292 1671319 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1124 08:52:06.943419 1671319 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1124 08:52:07.018443 1671319 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:0 ContainersRunning:0 ContainersPaused:0 ContainersStopped:0 Images:2 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:23 OomKillDisable:true NGoroutines:42 SystemTime:2025-11-24 08:52:07.009065431 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:a
arch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214831104 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx P
ath:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1124 08:52:07.018566 1671319 docker.go:319] overlay module found
	I1124 08:52:07.019912 1671319 out.go:179] * Using the docker driver based on user configuration
	I1124 08:52:07.020966 1671319 start.go:309] selected driver: docker
	I1124 08:52:07.020973 1671319 start.go:927] validating driver "docker" against <nil>
	I1124 08:52:07.020985 1671319 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1124 08:52:07.021101 1671319 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1124 08:52:07.075113 1671319 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:0 ContainersRunning:0 ContainersPaused:0 ContainersStopped:0 Images:2 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:23 OomKillDisable:true NGoroutines:42 SystemTime:2025-11-24 08:52:07.066277111 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:a
arch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214831104 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx P
ath:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1124 08:52:07.075252 1671319 start_flags.go:327] no existing cluster config was found, will generate one from the flags 
	I1124 08:52:07.075547 1671319 start_flags.go:410] Using suggested 3072MB memory alloc based on sys=7834MB, container=7834MB
	I1124 08:52:07.075711 1671319 start_flags.go:974] Wait components to verify : map[apiserver:true system_pods:true]
	I1124 08:52:07.076961 1671319 out.go:179] * Using Docker driver with root privileges
	I1124 08:52:07.078400 1671319 cni.go:84] Creating CNI manager for ""
	I1124 08:52:07.078549 1671319 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1124 08:52:07.078559 1671319 start_flags.go:336] Found "CNI" CNI - setting NetworkPlugin=cni
	I1124 08:52:07.078633 1671319 start.go:353] cluster config:
	{Name:dockerenv-050504 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:3072 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.2 ClusterName:dockerenv-050504 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerR
untime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.34.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1124 08:52:07.080413 1671319 out.go:179] * Starting "dockerenv-050504" primary control-plane node in "dockerenv-050504" cluster
	I1124 08:52:07.082280 1671319 cache.go:134] Beginning downloading kic base image for docker with containerd
	I1124 08:52:07.084202 1671319 out.go:179] * Pulling base image v0.0.48-1763789673-21948 ...
	I1124 08:52:07.086215 1671319 preload.go:188] Checking if preload exists for k8s version v1.34.2 and runtime containerd
	I1124 08:52:07.086255 1671319 preload.go:203] Found local preload: /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.34.2-containerd-overlay2-arm64.tar.lz4
	I1124 08:52:07.086263 1671319 cache.go:65] Caching tarball of preloaded images
	I1124 08:52:07.086284 1671319 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f in local docker daemon
	I1124 08:52:07.086359 1671319 preload.go:238] Found /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.34.2-containerd-overlay2-arm64.tar.lz4 in cache, skipping download
	I1124 08:52:07.086368 1671319 cache.go:68] Finished verifying existence of preloaded tar for v1.34.2 on containerd
	I1124 08:52:07.086812 1671319 profile.go:143] Saving config to /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/dockerenv-050504/config.json ...
	I1124 08:52:07.086834 1671319 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/dockerenv-050504/config.json: {Name:mkd5118c6585a650a58cf3c718005caac57f7df7 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1124 08:52:07.111933 1671319 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f in local docker daemon, skipping pull
	I1124 08:52:07.111944 1671319 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f exists in daemon, skipping load
	I1124 08:52:07.111957 1671319 cache.go:243] Successfully downloaded all kic artifacts
	I1124 08:52:07.111986 1671319 start.go:360] acquireMachinesLock for dockerenv-050504: {Name:mk21df1db1b47e78b140ce7bc6191cc42d2c9f52 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 08:52:07.112086 1671319 start.go:364] duration metric: took 86.63µs to acquireMachinesLock for "dockerenv-050504"
	I1124 08:52:07.112109 1671319 start.go:93] Provisioning new machine with config: &{Name:dockerenv-050504 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:3072 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.2 ClusterName:dockerenv-050504 Namespace:default APIServerHAVIP: APIServerNa
me:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.34.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAut
hSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name: IP: Port:8443 KubernetesVersion:v1.34.2 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I1124 08:52:07.113000 1671319 start.go:125] createHost starting for "" (driver="docker")
	I1124 08:52:07.115710 1671319 out.go:252] * Creating docker container (CPUs=2, Memory=3072MB) ...
	I1124 08:52:07.115941 1671319 start.go:159] libmachine.API.Create for "dockerenv-050504" (driver="docker")
	I1124 08:52:07.115972 1671319 client.go:173] LocalClient.Create starting
	I1124 08:52:07.116047 1671319 main.go:143] libmachine: Reading certificate data from /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem
	I1124 08:52:07.116080 1671319 main.go:143] libmachine: Decoding PEM data...
	I1124 08:52:07.116098 1671319 main.go:143] libmachine: Parsing certificate...
	I1124 08:52:07.116147 1671319 main.go:143] libmachine: Reading certificate data from /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/cert.pem
	I1124 08:52:07.116163 1671319 main.go:143] libmachine: Decoding PEM data...
	I1124 08:52:07.116173 1671319 main.go:143] libmachine: Parsing certificate...
	I1124 08:52:07.116531 1671319 cli_runner.go:164] Run: docker network inspect dockerenv-050504 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	W1124 08:52:07.132567 1671319 cli_runner.go:211] docker network inspect dockerenv-050504 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}" returned with exit code 1
	I1124 08:52:07.132643 1671319 network_create.go:284] running [docker network inspect dockerenv-050504] to gather additional debugging logs...
	I1124 08:52:07.132659 1671319 cli_runner.go:164] Run: docker network inspect dockerenv-050504
	W1124 08:52:07.149145 1671319 cli_runner.go:211] docker network inspect dockerenv-050504 returned with exit code 1
	I1124 08:52:07.149165 1671319 network_create.go:287] error running [docker network inspect dockerenv-050504]: docker network inspect dockerenv-050504: exit status 1
	stdout:
	[]
	
	stderr:
	Error response from daemon: network dockerenv-050504 not found
	I1124 08:52:07.149177 1671319 network_create.go:289] output of [docker network inspect dockerenv-050504]: -- stdout --
	[]
	
	-- /stdout --
	** stderr ** 
	Error response from daemon: network dockerenv-050504 not found
	
	** /stderr **
	I1124 08:52:07.149283 1671319 cli_runner.go:164] Run: docker network inspect bridge --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1124 08:52:07.166142 1671319 network.go:206] using free private subnet 192.168.49.0/24: &{IP:192.168.49.0 Netmask:255.255.255.0 Prefix:24 CIDR:192.168.49.0/24 Gateway:192.168.49.1 ClientMin:192.168.49.2 ClientMax:192.168.49.254 Broadcast:192.168.49.255 IsPrivate:true Interface:{IfaceName: IfaceIPv4: IfaceMTU:0 IfaceMAC:} reservation:0x40018c3cc0}
	I1124 08:52:07.166174 1671319 network_create.go:124] attempt to create docker network dockerenv-050504 192.168.49.0/24 with gateway 192.168.49.1 and MTU of 1500 ...
	I1124 08:52:07.166227 1671319 cli_runner.go:164] Run: docker network create --driver=bridge --subnet=192.168.49.0/24 --gateway=192.168.49.1 -o --ip-masq -o --icc -o com.docker.network.driver.mtu=1500 --label=created_by.minikube.sigs.k8s.io=true --label=name.minikube.sigs.k8s.io=dockerenv-050504 dockerenv-050504
	I1124 08:52:07.217640 1671319 network_create.go:108] docker network dockerenv-050504 192.168.49.0/24 created
	I1124 08:52:07.217662 1671319 kic.go:121] calculated static IP "192.168.49.2" for the "dockerenv-050504" container
	I1124 08:52:07.217740 1671319 cli_runner.go:164] Run: docker ps -a --format {{.Names}}
	I1124 08:52:07.231723 1671319 cli_runner.go:164] Run: docker volume create dockerenv-050504 --label name.minikube.sigs.k8s.io=dockerenv-050504 --label created_by.minikube.sigs.k8s.io=true
	I1124 08:52:07.249279 1671319 oci.go:103] Successfully created a docker volume dockerenv-050504
	I1124 08:52:07.249368 1671319 cli_runner.go:164] Run: docker run --rm --name dockerenv-050504-preload-sidecar --label created_by.minikube.sigs.k8s.io=true --label name.minikube.sigs.k8s.io=dockerenv-050504 --entrypoint /usr/bin/test -v dockerenv-050504:/var gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f -d /var/lib
	I1124 08:52:07.797631 1671319 oci.go:107] Successfully prepared a docker volume dockerenv-050504
	I1124 08:52:07.797688 1671319 preload.go:188] Checking if preload exists for k8s version v1.34.2 and runtime containerd
	I1124 08:52:07.797696 1671319 kic.go:194] Starting extracting preloaded images to volume ...
	I1124 08:52:07.797776 1671319 cli_runner.go:164] Run: docker run --rm --entrypoint /usr/bin/tar -v /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.34.2-containerd-overlay2-arm64.tar.lz4:/preloaded.tar:ro -v dockerenv-050504:/extractDir gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f -I lz4 -xf /preloaded.tar -C /extractDir
	I1124 08:52:11.840300 1671319 cli_runner.go:217] Completed: docker run --rm --entrypoint /usr/bin/tar -v /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.34.2-containerd-overlay2-arm64.tar.lz4:/preloaded.tar:ro -v dockerenv-050504:/extractDir gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f -I lz4 -xf /preloaded.tar -C /extractDir: (4.042489665s)
	I1124 08:52:11.840322 1671319 kic.go:203] duration metric: took 4.042622828s to extract preloaded images to volume ...
	W1124 08:52:11.840713 1671319 cgroups_linux.go:77] Your kernel does not support swap limit capabilities or the cgroup is not mounted.
	I1124 08:52:11.840827 1671319 cli_runner.go:164] Run: docker info --format "'{{json .SecurityOptions}}'"
	I1124 08:52:11.898047 1671319 cli_runner.go:164] Run: docker run -d -t --privileged --security-opt seccomp=unconfined --tmpfs /tmp --tmpfs /run -v /lib/modules:/lib/modules:ro --hostname dockerenv-050504 --name dockerenv-050504 --label created_by.minikube.sigs.k8s.io=true --label name.minikube.sigs.k8s.io=dockerenv-050504 --label role.minikube.sigs.k8s.io= --label mode.minikube.sigs.k8s.io=dockerenv-050504 --network dockerenv-050504 --ip 192.168.49.2 --volume dockerenv-050504:/var --security-opt apparmor=unconfined --memory=3072mb --cpus=2 -e container=docker --expose 8443 --publish=127.0.0.1::8443 --publish=127.0.0.1::22 --publish=127.0.0.1::2376 --publish=127.0.0.1::5000 --publish=127.0.0.1::32443 gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f
	I1124 08:52:12.217872 1671319 cli_runner.go:164] Run: docker container inspect dockerenv-050504 --format={{.State.Running}}
	I1124 08:52:12.239795 1671319 cli_runner.go:164] Run: docker container inspect dockerenv-050504 --format={{.State.Status}}
	I1124 08:52:12.265676 1671319 cli_runner.go:164] Run: docker exec dockerenv-050504 stat /var/lib/dpkg/alternatives/iptables
	I1124 08:52:12.323193 1671319 oci.go:144] the created container "dockerenv-050504" has a running status.
	I1124 08:52:12.323223 1671319 kic.go:225] Creating ssh key for kic: /home/jenkins/minikube-integration/21978-1652607/.minikube/machines/dockerenv-050504/id_rsa...
	I1124 08:52:12.640370 1671319 kic_runner.go:191] docker (temp): /home/jenkins/minikube-integration/21978-1652607/.minikube/machines/dockerenv-050504/id_rsa.pub --> /home/docker/.ssh/authorized_keys (381 bytes)
	I1124 08:52:12.663710 1671319 cli_runner.go:164] Run: docker container inspect dockerenv-050504 --format={{.State.Status}}
	I1124 08:52:12.702813 1671319 kic_runner.go:93] Run: chown docker:docker /home/docker/.ssh/authorized_keys
	I1124 08:52:12.702824 1671319 kic_runner.go:114] Args: [docker exec --privileged dockerenv-050504 chown docker:docker /home/docker/.ssh/authorized_keys]
	I1124 08:52:12.771156 1671319 cli_runner.go:164] Run: docker container inspect dockerenv-050504 --format={{.State.Status}}
	I1124 08:52:12.802066 1671319 machine.go:94] provisionDockerMachine start ...
	I1124 08:52:12.802337 1671319 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" dockerenv-050504
	I1124 08:52:12.835936 1671319 main.go:143] libmachine: Using SSH client type: native
	I1124 08:52:12.836269 1671319 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 34669 <nil> <nil>}
	I1124 08:52:12.836276 1671319 main.go:143] libmachine: About to run SSH command:
	hostname
	I1124 08:52:12.836948 1671319 main.go:143] libmachine: Error dialing TCP: ssh: handshake failed: EOF
	I1124 08:52:15.990094 1671319 main.go:143] libmachine: SSH cmd err, output: <nil>: dockerenv-050504
	
	I1124 08:52:15.990109 1671319 ubuntu.go:182] provisioning hostname "dockerenv-050504"
	I1124 08:52:15.990180 1671319 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" dockerenv-050504
	I1124 08:52:16.017705 1671319 main.go:143] libmachine: Using SSH client type: native
	I1124 08:52:16.018010 1671319 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 34669 <nil> <nil>}
	I1124 08:52:16.018019 1671319 main.go:143] libmachine: About to run SSH command:
	sudo hostname dockerenv-050504 && echo "dockerenv-050504" | sudo tee /etc/hostname
	I1124 08:52:16.179989 1671319 main.go:143] libmachine: SSH cmd err, output: <nil>: dockerenv-050504
	
	I1124 08:52:16.180081 1671319 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" dockerenv-050504
	I1124 08:52:16.202427 1671319 main.go:143] libmachine: Using SSH client type: native
	I1124 08:52:16.202776 1671319 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 34669 <nil> <nil>}
	I1124 08:52:16.202791 1671319 main.go:143] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sdockerenv-050504' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 dockerenv-050504/g' /etc/hosts;
				else 
					echo '127.0.1.1 dockerenv-050504' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1124 08:52:16.354663 1671319 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	I1124 08:52:16.354680 1671319 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/21978-1652607/.minikube CaCertPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/21978-1652607/.minikube}
	I1124 08:52:16.354709 1671319 ubuntu.go:190] setting up certificates
	I1124 08:52:16.354717 1671319 provision.go:84] configureAuth start
	I1124 08:52:16.354787 1671319 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" dockerenv-050504
	I1124 08:52:16.372380 1671319 provision.go:143] copyHostCerts
	I1124 08:52:16.372460 1671319 exec_runner.go:144] found /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.pem, removing ...
	I1124 08:52:16.372469 1671319 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.pem
	I1124 08:52:16.372549 1671319 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.pem (1078 bytes)
	I1124 08:52:16.372645 1671319 exec_runner.go:144] found /home/jenkins/minikube-integration/21978-1652607/.minikube/cert.pem, removing ...
	I1124 08:52:16.372649 1671319 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21978-1652607/.minikube/cert.pem
	I1124 08:52:16.372673 1671319 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/21978-1652607/.minikube/cert.pem (1123 bytes)
	I1124 08:52:16.372731 1671319 exec_runner.go:144] found /home/jenkins/minikube-integration/21978-1652607/.minikube/key.pem, removing ...
	I1124 08:52:16.372734 1671319 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21978-1652607/.minikube/key.pem
	I1124 08:52:16.372756 1671319 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/21978-1652607/.minikube/key.pem (1679 bytes)
	I1124 08:52:16.372821 1671319 provision.go:117] generating server cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca-key.pem org=jenkins.dockerenv-050504 san=[127.0.0.1 192.168.49.2 dockerenv-050504 localhost minikube]
	I1124 08:52:16.453087 1671319 provision.go:177] copyRemoteCerts
	I1124 08:52:16.453151 1671319 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1124 08:52:16.453191 1671319 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" dockerenv-050504
	I1124 08:52:16.471408 1671319 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34669 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/dockerenv-050504/id_rsa Username:docker}
	I1124 08:52:16.578090 1671319 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I1124 08:52:16.598804 1671319 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server.pem --> /etc/docker/server.pem (1216 bytes)
	I1124 08:52:16.618093 1671319 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I1124 08:52:16.636978 1671319 provision.go:87] duration metric: took 282.235333ms to configureAuth
	I1124 08:52:16.636996 1671319 ubuntu.go:206] setting minikube options for container-runtime
	I1124 08:52:16.637195 1671319 config.go:182] Loaded profile config "dockerenv-050504": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
	I1124 08:52:16.637206 1671319 machine.go:97] duration metric: took 3.835125343s to provisionDockerMachine
	I1124 08:52:16.637216 1671319 client.go:176] duration metric: took 9.521237509s to LocalClient.Create
	I1124 08:52:16.637241 1671319 start.go:167] duration metric: took 9.521301248s to libmachine.API.Create "dockerenv-050504"
	I1124 08:52:16.637247 1671319 start.go:293] postStartSetup for "dockerenv-050504" (driver="docker")
	I1124 08:52:16.637255 1671319 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1124 08:52:16.637305 1671319 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1124 08:52:16.637349 1671319 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" dockerenv-050504
	I1124 08:52:16.654878 1671319 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34669 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/dockerenv-050504/id_rsa Username:docker}
	I1124 08:52:16.758761 1671319 ssh_runner.go:195] Run: cat /etc/os-release
	I1124 08:52:16.762292 1671319 main.go:143] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1124 08:52:16.762310 1671319 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1124 08:52:16.762320 1671319 filesync.go:126] Scanning /home/jenkins/minikube-integration/21978-1652607/.minikube/addons for local assets ...
	I1124 08:52:16.762382 1671319 filesync.go:126] Scanning /home/jenkins/minikube-integration/21978-1652607/.minikube/files for local assets ...
	I1124 08:52:16.762401 1671319 start.go:296] duration metric: took 125.149271ms for postStartSetup
	I1124 08:52:16.762766 1671319 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" dockerenv-050504
	I1124 08:52:16.780092 1671319 profile.go:143] Saving config to /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/dockerenv-050504/config.json ...
	I1124 08:52:16.780368 1671319 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1124 08:52:16.780410 1671319 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" dockerenv-050504
	I1124 08:52:16.797433 1671319 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34669 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/dockerenv-050504/id_rsa Username:docker}
	I1124 08:52:16.899576 1671319 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1124 08:52:16.904108 1671319 start.go:128] duration metric: took 9.791091193s to createHost
	I1124 08:52:16.904123 1671319 start.go:83] releasing machines lock for "dockerenv-050504", held for 9.792031619s
	I1124 08:52:16.904190 1671319 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" dockerenv-050504
	I1124 08:52:16.921138 1671319 ssh_runner.go:195] Run: cat /version.json
	I1124 08:52:16.921191 1671319 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" dockerenv-050504
	I1124 08:52:16.921460 1671319 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1124 08:52:16.921522 1671319 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" dockerenv-050504
	I1124 08:52:16.946781 1671319 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34669 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/dockerenv-050504/id_rsa Username:docker}
	I1124 08:52:16.955959 1671319 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34669 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/dockerenv-050504/id_rsa Username:docker}
	I1124 08:52:17.054307 1671319 ssh_runner.go:195] Run: systemctl --version
	I1124 08:52:17.144987 1671319 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W1124 08:52:17.149327 1671319 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1124 08:52:17.149387 1671319 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1124 08:52:17.179613 1671319 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist, /etc/cni/net.d/10-crio-bridge.conflist.disabled] bridge cni config(s)
	I1124 08:52:17.179627 1671319 start.go:496] detecting cgroup driver to use...
	I1124 08:52:17.179657 1671319 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1124 08:52:17.179706 1671319 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I1124 08:52:17.195277 1671319 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I1124 08:52:17.207688 1671319 docker.go:218] disabling cri-docker service (if available) ...
	I1124 08:52:17.207742 1671319 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1124 08:52:17.224654 1671319 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1124 08:52:17.243313 1671319 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1124 08:52:17.350726 1671319 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1124 08:52:17.472403 1671319 docker.go:234] disabling docker service ...
	I1124 08:52:17.472460 1671319 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1124 08:52:17.494523 1671319 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1124 08:52:17.507993 1671319 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1124 08:52:17.620875 1671319 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1124 08:52:17.761451 1671319 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1124 08:52:17.776966 1671319 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I1124 08:52:17.790989 1671319 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.34.2/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.34.2/bin/linux/arm64/kubeadm.sha256
	I1124 08:52:17.946506 1671319 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10.1"|' /etc/containerd/config.toml"
	I1124 08:52:17.955902 1671319 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I1124 08:52:17.964574 1671319 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I1124 08:52:17.964635 1671319 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I1124 08:52:17.973133 1671319 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1124 08:52:17.982448 1671319 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I1124 08:52:17.991517 1671319 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1124 08:52:18.001312 1671319 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1124 08:52:18.012150 1671319 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I1124 08:52:18.022563 1671319 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I1124 08:52:18.032622 1671319 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I1124 08:52:18.042281 1671319 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1124 08:52:18.050549 1671319 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1124 08:52:18.058595 1671319 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1124 08:52:18.177612 1671319 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I1124 08:52:18.310848 1671319 start.go:543] Will wait 60s for socket path /run/containerd/containerd.sock
	I1124 08:52:18.310913 1671319 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I1124 08:52:18.315286 1671319 start.go:564] Will wait 60s for crictl version
	I1124 08:52:18.315340 1671319 ssh_runner.go:195] Run: which crictl
	I1124 08:52:18.319990 1671319 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1124 08:52:18.346607 1671319 start.go:580] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v2.1.5
	RuntimeApiVersion:  v1
	I1124 08:52:18.346701 1671319 ssh_runner.go:195] Run: containerd --version
	I1124 08:52:18.367197 1671319 ssh_runner.go:195] Run: containerd --version
	I1124 08:52:18.398728 1671319 out.go:179] * Preparing Kubernetes v1.34.2 on containerd 2.1.5 ...
	I1124 08:52:18.401666 1671319 cli_runner.go:164] Run: docker network inspect dockerenv-050504 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1124 08:52:18.418359 1671319 ssh_runner.go:195] Run: grep 192.168.49.1	host.minikube.internal$ /etc/hosts
	I1124 08:52:18.422202 1671319 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.168.49.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I1124 08:52:18.431713 1671319 kubeadm.go:884] updating cluster {Name:dockerenv-050504 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:3072 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.2 ClusterName:dockerenv-050504 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APISe
rverNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8443 KubernetesVersion:v1.34.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock:
SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1124 08:52:18.431888 1671319 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.34.2/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.34.2/bin/linux/arm64/kubeadm.sha256
	I1124 08:52:18.605943 1671319 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.34.2/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.34.2/bin/linux/arm64/kubeadm.sha256
	I1124 08:52:18.787368 1671319 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.34.2/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.34.2/bin/linux/arm64/kubeadm.sha256
	I1124 08:52:18.941151 1671319 preload.go:188] Checking if preload exists for k8s version v1.34.2 and runtime containerd
	I1124 08:52:18.941322 1671319 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.34.2/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.34.2/bin/linux/arm64/kubeadm.sha256
	I1124 08:52:19.095643 1671319 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.34.2/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.34.2/bin/linux/arm64/kubeadm.sha256
	I1124 08:52:19.237538 1671319 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.34.2/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.34.2/bin/linux/arm64/kubeadm.sha256
	I1124 08:52:19.379786 1671319 ssh_runner.go:195] Run: sudo crictl images --output json
	I1124 08:52:19.409100 1671319 containerd.go:627] all images are preloaded for containerd runtime.
	I1124 08:52:19.409111 1671319 containerd.go:534] Images already preloaded, skipping extraction
	I1124 08:52:19.409173 1671319 ssh_runner.go:195] Run: sudo crictl images --output json
	I1124 08:52:19.432892 1671319 containerd.go:627] all images are preloaded for containerd runtime.
	I1124 08:52:19.432903 1671319 cache_images.go:86] Images are preloaded, skipping loading
	I1124 08:52:19.432910 1671319 kubeadm.go:935] updating node { 192.168.49.2 8443 v1.34.2 containerd true true} ...
	I1124 08:52:19.432997 1671319 kubeadm.go:947] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.34.2/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=dockerenv-050504 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.49.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.34.2 ClusterName:dockerenv-050504 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I1124 08:52:19.433090 1671319 ssh_runner.go:195] Run: sudo crictl info
	I1124 08:52:19.458377 1671319 cni.go:84] Creating CNI manager for ""
	I1124 08:52:19.458388 1671319 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1124 08:52:19.458409 1671319 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1124 08:52:19.458430 1671319 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.49.2 APIServerPort:8443 KubernetesVersion:v1.34.2 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:dockerenv-050504 NodeName:dockerenv-050504 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.49.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.49.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPat
h:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1124 08:52:19.458610 1671319 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.49.2
	  bindPort: 8443
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "dockerenv-050504"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.49.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8443
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.34.2
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I1124 08:52:19.458688 1671319 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.34.2
	I1124 08:52:19.467558 1671319 binaries.go:51] Found k8s binaries, skipping transfer
	I1124 08:52:19.467627 1671319 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1124 08:52:19.475508 1671319 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (320 bytes)
	I1124 08:52:19.488296 1671319 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I1124 08:52:19.501305 1671319 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2229 bytes)
	I1124 08:52:19.514645 1671319 ssh_runner.go:195] Run: grep 192.168.49.2	control-plane.minikube.internal$ /etc/hosts
	I1124 08:52:19.518162 1671319 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.168.49.2	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I1124 08:52:19.528083 1671319 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1124 08:52:19.650183 1671319 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1124 08:52:19.670997 1671319 certs.go:69] Setting up /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/dockerenv-050504 for IP: 192.168.49.2
	I1124 08:52:19.671008 1671319 certs.go:195] generating shared ca certs ...
	I1124 08:52:19.671033 1671319 certs.go:227] acquiring lock for ca certs: {Name:mkbe540a30c4376a351176f7fe6fec044d058b09 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1124 08:52:19.671213 1671319 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.key
	I1124 08:52:19.671268 1671319 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/proxy-client-ca.key
	I1124 08:52:19.671275 1671319 certs.go:257] generating profile certs ...
	I1124 08:52:19.671354 1671319 certs.go:364] generating signed profile cert for "minikube-user": /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/dockerenv-050504/client.key
	I1124 08:52:19.671365 1671319 crypto.go:68] Generating cert /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/dockerenv-050504/client.crt with IP's: []
	I1124 08:52:19.812468 1671319 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/dockerenv-050504/client.crt ...
	I1124 08:52:19.812485 1671319 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/dockerenv-050504/client.crt: {Name:mke6946f2baf53905284dbc125c8905a4aa07502 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1124 08:52:19.812709 1671319 crypto.go:164] Writing key to /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/dockerenv-050504/client.key ...
	I1124 08:52:19.812716 1671319 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/dockerenv-050504/client.key: {Name:mke11e7d42d828d9a224470c5f9394befb405aac Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1124 08:52:19.812819 1671319 certs.go:364] generating signed profile cert for "minikube": /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/dockerenv-050504/apiserver.key.1f3863f7
	I1124 08:52:19.812831 1671319 crypto.go:68] Generating cert /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/dockerenv-050504/apiserver.crt.1f3863f7 with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.168.49.2]
	I1124 08:52:20.116889 1671319 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/dockerenv-050504/apiserver.crt.1f3863f7 ...
	I1124 08:52:20.116905 1671319 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/dockerenv-050504/apiserver.crt.1f3863f7: {Name:mkbed0bfbd74537d917229e15cec0f3f9148d0a5 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1124 08:52:20.117107 1671319 crypto.go:164] Writing key to /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/dockerenv-050504/apiserver.key.1f3863f7 ...
	I1124 08:52:20.117115 1671319 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/dockerenv-050504/apiserver.key.1f3863f7: {Name:mk19bb80ee4c8fb50b2234f4d1346e43ae8f8cbe Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1124 08:52:20.117203 1671319 certs.go:382] copying /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/dockerenv-050504/apiserver.crt.1f3863f7 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/dockerenv-050504/apiserver.crt
	I1124 08:52:20.117285 1671319 certs.go:386] copying /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/dockerenv-050504/apiserver.key.1f3863f7 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/dockerenv-050504/apiserver.key
	I1124 08:52:20.117339 1671319 certs.go:364] generating signed profile cert for "aggregator": /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/dockerenv-050504/proxy-client.key
	I1124 08:52:20.117350 1671319 crypto.go:68] Generating cert /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/dockerenv-050504/proxy-client.crt with IP's: []
	I1124 08:52:20.653627 1671319 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/dockerenv-050504/proxy-client.crt ...
	I1124 08:52:20.653644 1671319 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/dockerenv-050504/proxy-client.crt: {Name:mk54aee665ce5f43b4aa4b47b0939c6df58002e4 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1124 08:52:20.653850 1671319 crypto.go:164] Writing key to /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/dockerenv-050504/proxy-client.key ...
	I1124 08:52:20.653859 1671319 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/dockerenv-050504/proxy-client.key: {Name:mkb87f82f7c0000b178405d83c3ba3a3fb5d73f0 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1124 08:52:20.654074 1671319 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca-key.pem (1671 bytes)
	I1124 08:52:20.654113 1671319 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem (1078 bytes)
	I1124 08:52:20.654137 1671319 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/cert.pem (1123 bytes)
	I1124 08:52:20.654159 1671319 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/key.pem (1679 bytes)
	I1124 08:52:20.654815 1671319 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1124 08:52:20.674670 1671319 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I1124 08:52:20.693727 1671319 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1124 08:52:20.711528 1671319 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I1124 08:52:20.729366 1671319 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/dockerenv-050504/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I1124 08:52:20.746552 1671319 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/dockerenv-050504/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I1124 08:52:20.764031 1671319 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/dockerenv-050504/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1124 08:52:20.783750 1671319 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/dockerenv-050504/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I1124 08:52:20.802088 1671319 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1124 08:52:20.820263 1671319 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I1124 08:52:20.833481 1671319 ssh_runner.go:195] Run: openssl version
	I1124 08:52:20.842472 1671319 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I1124 08:52:20.851908 1671319 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1124 08:52:20.856008 1671319 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Nov 24 08:44 /usr/share/ca-certificates/minikubeCA.pem
	I1124 08:52:20.856066 1671319 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1124 08:52:20.900434 1671319 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I1124 08:52:20.908721 1671319 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1124 08:52:20.912337 1671319 certs.go:400] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I1124 08:52:20.912379 1671319 kubeadm.go:401] StartCluster: {Name:dockerenv-050504 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:3072 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.2 ClusterName:dockerenv-050504 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8443 KubernetesVersion:v1.34.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1124 08:52:20.912458 1671319 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I1124 08:52:20.912524 1671319 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1124 08:52:20.941570 1671319 cri.go:89] found id: ""
	I1124 08:52:20.941634 1671319 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1124 08:52:20.949423 1671319 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1124 08:52:20.957168 1671319 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1124 08:52:20.957236 1671319 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1124 08:52:20.964678 1671319 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1124 08:52:20.964687 1671319 kubeadm.go:158] found existing configuration files:
	
	I1124 08:52:20.964736 1671319 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf
	I1124 08:52:20.972636 1671319 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1124 08:52:20.972705 1671319 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1124 08:52:20.980221 1671319 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf
	I1124 08:52:20.987463 1671319 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1124 08:52:20.987515 1671319 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1124 08:52:20.994609 1671319 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf
	I1124 08:52:21.004131 1671319 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1124 08:52:21.004201 1671319 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1124 08:52:21.012856 1671319 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf
	I1124 08:52:21.021080 1671319 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1124 08:52:21.021143 1671319 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1124 08:52:21.029132 1671319 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.34.2:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1124 08:52:21.073393 1671319 kubeadm.go:319] [init] Using Kubernetes version: v1.34.2
	I1124 08:52:21.073693 1671319 kubeadm.go:319] [preflight] Running pre-flight checks
	I1124 08:52:21.105387 1671319 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1124 08:52:21.105466 1671319 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1124 08:52:21.105505 1671319 kubeadm.go:319] OS: Linux
	I1124 08:52:21.105549 1671319 kubeadm.go:319] CGROUPS_CPU: enabled
	I1124 08:52:21.105596 1671319 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1124 08:52:21.105642 1671319 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1124 08:52:21.105688 1671319 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1124 08:52:21.105735 1671319 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1124 08:52:21.105784 1671319 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1124 08:52:21.105828 1671319 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1124 08:52:21.105874 1671319 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1124 08:52:21.105919 1671319 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1124 08:52:21.191779 1671319 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1124 08:52:21.191881 1671319 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1124 08:52:21.191970 1671319 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1124 08:52:21.197389 1671319 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1124 08:52:21.204000 1671319 out.go:252]   - Generating certificates and keys ...
	I1124 08:52:21.204116 1671319 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1124 08:52:21.204189 1671319 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1124 08:52:21.492533 1671319 kubeadm.go:319] [certs] Generating "apiserver-kubelet-client" certificate and key
	I1124 08:52:21.622980 1671319 kubeadm.go:319] [certs] Generating "front-proxy-ca" certificate and key
	I1124 08:52:21.961521 1671319 kubeadm.go:319] [certs] Generating "front-proxy-client" certificate and key
	I1124 08:52:22.397294 1671319 kubeadm.go:319] [certs] Generating "etcd/ca" certificate and key
	I1124 08:52:22.665091 1671319 kubeadm.go:319] [certs] Generating "etcd/server" certificate and key
	I1124 08:52:22.665380 1671319 kubeadm.go:319] [certs] etcd/server serving cert is signed for DNS names [dockerenv-050504 localhost] and IPs [192.168.49.2 127.0.0.1 ::1]
	I1124 08:52:23.035049 1671319 kubeadm.go:319] [certs] Generating "etcd/peer" certificate and key
	I1124 08:52:23.035341 1671319 kubeadm.go:319] [certs] etcd/peer serving cert is signed for DNS names [dockerenv-050504 localhost] and IPs [192.168.49.2 127.0.0.1 ::1]
	I1124 08:52:23.173488 1671319 kubeadm.go:319] [certs] Generating "etcd/healthcheck-client" certificate and key
	I1124 08:52:23.679956 1671319 kubeadm.go:319] [certs] Generating "apiserver-etcd-client" certificate and key
	I1124 08:52:24.431040 1671319 kubeadm.go:319] [certs] Generating "sa" key and public key
	I1124 08:52:24.431271 1671319 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1124 08:52:24.868622 1671319 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1124 08:52:26.251530 1671319 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1124 08:52:26.341778 1671319 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1124 08:52:26.936577 1671319 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1124 08:52:27.395108 1671319 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1124 08:52:27.396758 1671319 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1124 08:52:27.401154 1671319 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1124 08:52:27.404625 1671319 out.go:252]   - Booting up control plane ...
	I1124 08:52:27.404746 1671319 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1124 08:52:27.404885 1671319 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1124 08:52:27.404954 1671319 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1124 08:52:27.420549 1671319 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1124 08:52:27.420659 1671319 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1124 08:52:27.428173 1671319 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1124 08:52:27.428515 1671319 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1124 08:52:27.428720 1671319 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1124 08:52:27.564737 1671319 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1124 08:52:27.564866 1671319 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1124 08:52:28.566214 1671319 kubeadm.go:319] [kubelet-check] The kubelet is healthy after 1.001766748s
	I1124 08:52:28.571128 1671319 kubeadm.go:319] [control-plane-check] Waiting for healthy control plane components. This can take up to 4m0s
	I1124 08:52:28.571222 1671319 kubeadm.go:319] [control-plane-check] Checking kube-apiserver at https://192.168.49.2:8443/livez
	I1124 08:52:28.571589 1671319 kubeadm.go:319] [control-plane-check] Checking kube-controller-manager at https://127.0.0.1:10257/healthz
	I1124 08:52:28.571688 1671319 kubeadm.go:319] [control-plane-check] Checking kube-scheduler at https://127.0.0.1:10259/livez
	I1124 08:52:34.417795 1671319 kubeadm.go:319] [control-plane-check] kube-controller-manager is healthy after 5.843955884s
	I1124 08:52:35.332708 1671319 kubeadm.go:319] [control-plane-check] kube-scheduler is healthy after 6.75853309s
	I1124 08:52:36.575248 1671319 kubeadm.go:319] [control-plane-check] kube-apiserver is healthy after 8.00249306s
	I1124 08:52:36.609293 1671319 kubeadm.go:319] [upload-config] Storing the configuration used in ConfigMap "kubeadm-config" in the "kube-system" Namespace
	I1124 08:52:36.632490 1671319 kubeadm.go:319] [kubelet] Creating a ConfigMap "kubelet-config" in namespace kube-system with the configuration for the kubelets in the cluster
	I1124 08:52:36.650668 1671319 kubeadm.go:319] [upload-certs] Skipping phase. Please see --upload-certs
	I1124 08:52:36.651086 1671319 kubeadm.go:319] [mark-control-plane] Marking the node dockerenv-050504 as control-plane by adding the labels: [node-role.kubernetes.io/control-plane node.kubernetes.io/exclude-from-external-load-balancers]
	I1124 08:52:36.671319 1671319 kubeadm.go:319] [bootstrap-token] Using token: slozb6.bwm3ncha3z0mzhvf
	I1124 08:52:36.674380 1671319 out.go:252]   - Configuring RBAC rules ...
	I1124 08:52:36.674557 1671319 kubeadm.go:319] [bootstrap-token] Configuring bootstrap tokens, cluster-info ConfigMap, RBAC Roles
	I1124 08:52:36.681667 1671319 kubeadm.go:319] [bootstrap-token] Configured RBAC rules to allow Node Bootstrap tokens to get nodes
	I1124 08:52:36.700038 1671319 kubeadm.go:319] [bootstrap-token] Configured RBAC rules to allow Node Bootstrap tokens to post CSRs in order for nodes to get long term certificate credentials
	I1124 08:52:36.704569 1671319 kubeadm.go:319] [bootstrap-token] Configured RBAC rules to allow the csrapprover controller automatically approve CSRs from a Node Bootstrap Token
	I1124 08:52:36.714651 1671319 kubeadm.go:319] [bootstrap-token] Configured RBAC rules to allow certificate rotation for all node client certificates in the cluster
	I1124 08:52:36.723587 1671319 kubeadm.go:319] [bootstrap-token] Creating the "cluster-info" ConfigMap in the "kube-public" namespace
	I1124 08:52:36.985297 1671319 kubeadm.go:319] [kubelet-finalize] Updating "/etc/kubernetes/kubelet.conf" to point to a rotatable kubelet client certificate and key
	I1124 08:52:37.427164 1671319 kubeadm.go:319] [addons] Applied essential addon: CoreDNS
	I1124 08:52:37.983542 1671319 kubeadm.go:319] [addons] Applied essential addon: kube-proxy
	I1124 08:52:37.984852 1671319 kubeadm.go:319] 
	I1124 08:52:37.984946 1671319 kubeadm.go:319] Your Kubernetes control-plane has initialized successfully!
	I1124 08:52:37.984954 1671319 kubeadm.go:319] 
	I1124 08:52:37.985031 1671319 kubeadm.go:319] To start using your cluster, you need to run the following as a regular user:
	I1124 08:52:37.985034 1671319 kubeadm.go:319] 
	I1124 08:52:37.985058 1671319 kubeadm.go:319]   mkdir -p $HOME/.kube
	I1124 08:52:37.985116 1671319 kubeadm.go:319]   sudo cp -i /etc/kubernetes/admin.conf $HOME/.kube/config
	I1124 08:52:37.985165 1671319 kubeadm.go:319]   sudo chown $(id -u):$(id -g) $HOME/.kube/config
	I1124 08:52:37.985168 1671319 kubeadm.go:319] 
	I1124 08:52:37.985221 1671319 kubeadm.go:319] Alternatively, if you are the root user, you can run:
	I1124 08:52:37.985227 1671319 kubeadm.go:319] 
	I1124 08:52:37.985273 1671319 kubeadm.go:319]   export KUBECONFIG=/etc/kubernetes/admin.conf
	I1124 08:52:37.985276 1671319 kubeadm.go:319] 
	I1124 08:52:37.985327 1671319 kubeadm.go:319] You should now deploy a pod network to the cluster.
	I1124 08:52:37.985401 1671319 kubeadm.go:319] Run "kubectl apply -f [podnetwork].yaml" with one of the options listed at:
	I1124 08:52:37.985468 1671319 kubeadm.go:319]   https://kubernetes.io/docs/concepts/cluster-administration/addons/
	I1124 08:52:37.985470 1671319 kubeadm.go:319] 
	I1124 08:52:37.985553 1671319 kubeadm.go:319] You can now join any number of control-plane nodes by copying certificate authorities
	I1124 08:52:37.985629 1671319 kubeadm.go:319] and service account keys on each node and then running the following as root:
	I1124 08:52:37.985632 1671319 kubeadm.go:319] 
	I1124 08:52:37.985715 1671319 kubeadm.go:319]   kubeadm join control-plane.minikube.internal:8443 --token slozb6.bwm3ncha3z0mzhvf \
	I1124 08:52:37.985818 1671319 kubeadm.go:319] 	--discovery-token-ca-cert-hash sha256:d55a4d7f583a51ce0fd49715bdb144dc7e07b5286773075ca535e70f191df377 \
	I1124 08:52:37.985837 1671319 kubeadm.go:319] 	--control-plane 
	I1124 08:52:37.985839 1671319 kubeadm.go:319] 
	I1124 08:52:37.985923 1671319 kubeadm.go:319] Then you can join any number of worker nodes by running the following on each as root:
	I1124 08:52:37.985926 1671319 kubeadm.go:319] 
	I1124 08:52:37.986011 1671319 kubeadm.go:319] kubeadm join control-plane.minikube.internal:8443 --token slozb6.bwm3ncha3z0mzhvf \
	I1124 08:52:37.986120 1671319 kubeadm.go:319] 	--discovery-token-ca-cert-hash sha256:d55a4d7f583a51ce0fd49715bdb144dc7e07b5286773075ca535e70f191df377 
	I1124 08:52:37.990646 1671319 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is in maintenance mode, please migrate to cgroups v2
	I1124 08:52:37.990868 1671319 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1124 08:52:37.990972 1671319 kubeadm.go:319] 	[WARNING Service-Kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1124 08:52:37.990986 1671319 cni.go:84] Creating CNI manager for ""
	I1124 08:52:37.990993 1671319 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1124 08:52:37.996208 1671319 out.go:179] * Configuring CNI (Container Networking Interface) ...
	I1124 08:52:37.999085 1671319 ssh_runner.go:195] Run: stat /opt/cni/bin/portmap
	I1124 08:52:38.006898 1671319 cni.go:182] applying CNI manifest using /var/lib/minikube/binaries/v1.34.2/kubectl ...
	I1124 08:52:38.006910 1671319 ssh_runner.go:362] scp memory --> /var/tmp/minikube/cni.yaml (2601 bytes)
	I1124 08:52:38.021416 1671319 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.34.2/kubectl apply --kubeconfig=/var/lib/minikube/kubeconfig -f /var/tmp/minikube/cni.yaml
	I1124 08:52:38.323739 1671319 ssh_runner.go:195] Run: /bin/bash -c "cat /proc/$(pgrep kube-apiserver)/oom_adj"
	I1124 08:52:38.323873 1671319 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.34.2/kubectl create clusterrolebinding minikube-rbac --clusterrole=cluster-admin --serviceaccount=kube-system:default --kubeconfig=/var/lib/minikube/kubeconfig
	I1124 08:52:38.323949 1671319 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.34.2/kubectl --kubeconfig=/var/lib/minikube/kubeconfig label --overwrite nodes dockerenv-050504 minikube.k8s.io/updated_at=2025_11_24T08_52_38_0700 minikube.k8s.io/version=v1.37.0 minikube.k8s.io/commit=393ee3e0b845623107dce6cda4f48ffd5c3d1811 minikube.k8s.io/name=dockerenv-050504 minikube.k8s.io/primary=true
	I1124 08:52:38.484904 1671319 ops.go:34] apiserver oom_adj: -16
	I1124 08:52:38.484935 1671319 kubeadm.go:1114] duration metric: took 161.121962ms to wait for elevateKubeSystemPrivileges
	I1124 08:52:38.484949 1671319 kubeadm.go:403] duration metric: took 17.572575419s to StartCluster
	I1124 08:52:38.484964 1671319 settings.go:142] acquiring lock: {Name:mk6c04793f5fd4f38f92abf4357247f2ccd7fc4e Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1124 08:52:38.485019 1671319 settings.go:150] Updating kubeconfig:  /home/jenkins/minikube-integration/21978-1652607/kubeconfig
	I1124 08:52:38.485622 1671319 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21978-1652607/kubeconfig: {Name:mk02121ae6148bede61eabf0ed4e1826024715f4 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1124 08:52:38.485814 1671319 start.go:236] Will wait 6m0s for node &{Name: IP:192.168.49.2 Port:8443 KubernetesVersion:v1.34.2 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I1124 08:52:38.485890 1671319 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.34.2/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml"
	I1124 08:52:38.486109 1671319 config.go:182] Loaded profile config "dockerenv-050504": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
	I1124 08:52:38.486141 1671319 addons.go:527] enable addons start: toEnable=map[ambassador:false amd-gpu-device-plugin:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:true efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubetail:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:true storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I1124 08:52:38.486194 1671319 addons.go:70] Setting storage-provisioner=true in profile "dockerenv-050504"
	I1124 08:52:38.486207 1671319 addons.go:239] Setting addon storage-provisioner=true in "dockerenv-050504"
	I1124 08:52:38.486227 1671319 host.go:66] Checking if "dockerenv-050504" exists ...
	I1124 08:52:38.486738 1671319 cli_runner.go:164] Run: docker container inspect dockerenv-050504 --format={{.State.Status}}
	I1124 08:52:38.486974 1671319 addons.go:70] Setting default-storageclass=true in profile "dockerenv-050504"
	I1124 08:52:38.486988 1671319 addons_storage_classes.go:34] enableOrDisableStorageClasses default-storageclass=true on "dockerenv-050504"
	I1124 08:52:38.487237 1671319 cli_runner.go:164] Run: docker container inspect dockerenv-050504 --format={{.State.Status}}
	I1124 08:52:38.490142 1671319 out.go:179] * Verifying Kubernetes components...
	I1124 08:52:38.493248 1671319 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1124 08:52:38.521905 1671319 addons.go:239] Setting addon default-storageclass=true in "dockerenv-050504"
	I1124 08:52:38.521931 1671319 host.go:66] Checking if "dockerenv-050504" exists ...
	I1124 08:52:38.522350 1671319 cli_runner.go:164] Run: docker container inspect dockerenv-050504 --format={{.State.Status}}
	I1124 08:52:38.539866 1671319 out.go:179]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	I1124 08:52:38.542793 1671319 addons.go:436] installing /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 08:52:38.542805 1671319 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
	I1124 08:52:38.542869 1671319 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" dockerenv-050504
	I1124 08:52:38.578069 1671319 addons.go:436] installing /etc/kubernetes/addons/storageclass.yaml
	I1124 08:52:38.578084 1671319 ssh_runner.go:362] scp storageclass/storageclass.yaml --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
	I1124 08:52:38.578147 1671319 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" dockerenv-050504
	I1124 08:52:38.604684 1671319 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34669 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/dockerenv-050504/id_rsa Username:docker}
	I1124 08:52:38.613288 1671319 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34669 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/dockerenv-050504/id_rsa Username:docker}
	I1124 08:52:38.771083 1671319 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.34.2/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml | sed -e '/^        forward . \/etc\/resolv.conf.*/i \        hosts {\n           192.168.49.1 host.minikube.internal\n           fallthrough\n        }' -e '/^        errors *$/i \        log' | sudo /var/lib/minikube/binaries/v1.34.2/kubectl --kubeconfig=/var/lib/minikube/kubeconfig replace -f -"
	I1124 08:52:38.791960 1671319 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1124 08:52:38.907059 1671319 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.2/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 08:52:38.950984 1671319 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.2/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
	I1124 08:52:39.085149 1671319 start.go:977] {"host.minikube.internal": 192.168.49.1} host record injected into CoreDNS's ConfigMap
	I1124 08:52:39.086693 1671319 api_server.go:52] waiting for apiserver process to appear ...
	I1124 08:52:39.086749 1671319 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 08:52:39.414163 1671319 api_server.go:72] duration metric: took 928.325615ms to wait for apiserver process to appear ...
	I1124 08:52:39.414175 1671319 api_server.go:88] waiting for apiserver healthz status ...
	I1124 08:52:39.414190 1671319 api_server.go:253] Checking apiserver healthz at https://192.168.49.2:8443/healthz ...
	I1124 08:52:39.423532 1671319 api_server.go:279] https://192.168.49.2:8443/healthz returned 200:
	ok
	I1124 08:52:39.425716 1671319 api_server.go:141] control plane version: v1.34.2
	I1124 08:52:39.425733 1671319 api_server.go:131] duration metric: took 11.552903ms to wait for apiserver health ...
	I1124 08:52:39.425741 1671319 system_pods.go:43] waiting for kube-system pods to appear ...
	I1124 08:52:39.445837 1671319 system_pods.go:59] 5 kube-system pods found
	I1124 08:52:39.445870 1671319 system_pods.go:61] "etcd-dockerenv-050504" [766ab9bc-9bdf-4a72-8979-6cc7a5399be9] Running / Ready:ContainersNotReady (containers with unready status: [etcd]) / ContainersReady:ContainersNotReady (containers with unready status: [etcd])
	I1124 08:52:39.445881 1671319 system_pods.go:61] "kube-apiserver-dockerenv-050504" [dcc2d0e5-c423-40dc-a7f0-cbfdb2112210] Running / Ready:ContainersNotReady (containers with unready status: [kube-apiserver]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-apiserver])
	I1124 08:52:39.445888 1671319 system_pods.go:61] "kube-controller-manager-dockerenv-050504" [012caf6e-c3ef-4da6-8a9b-7517f7a47bbf] Running / Ready:ContainersNotReady (containers with unready status: [kube-controller-manager]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-controller-manager])
	I1124 08:52:39.445893 1671319 system_pods.go:61] "kube-scheduler-dockerenv-050504" [4f386d74-fa2c-4450-825a-3c91227daa24] Running / Ready:ContainersNotReady (containers with unready status: [kube-scheduler]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-scheduler])
	I1124 08:52:39.445898 1671319 system_pods.go:61] "storage-provisioner" [c515ef78-ba45-42f2-a060-8e1ccd3535e3] Pending: PodScheduled:Unschedulable (0/1 nodes are available: 1 node(s) had untolerated taint(s). no new claims to deallocate, preemption: 0/1 nodes are available: 1 Preemption is not helpful for scheduling.)
	I1124 08:52:39.445905 1671319 system_pods.go:74] duration metric: took 20.158421ms to wait for pod list to return data ...
	I1124 08:52:39.445915 1671319 kubeadm.go:587] duration metric: took 960.081789ms to wait for: map[apiserver:true system_pods:true]
	I1124 08:52:39.445926 1671319 node_conditions.go:102] verifying NodePressure condition ...
	I1124 08:52:39.452533 1671319 out.go:179] * Enabled addons: storage-provisioner, default-storageclass
	I1124 08:52:39.455568 1671319 addons.go:530] duration metric: took 969.418811ms for enable addons: enabled=[storage-provisioner default-storageclass]
	I1124 08:52:39.460543 1671319 node_conditions.go:122] node storage ephemeral capacity is 203034800Ki
	I1124 08:52:39.460561 1671319 node_conditions.go:123] node cpu capacity is 2
	I1124 08:52:39.460572 1671319 node_conditions.go:105] duration metric: took 14.642543ms to run NodePressure ...
	I1124 08:52:39.460594 1671319 start.go:242] waiting for startup goroutines ...
	I1124 08:52:39.590363 1671319 kapi.go:214] "coredns" deployment in "kube-system" namespace and "dockerenv-050504" context rescaled to 1 replicas
	I1124 08:52:39.590392 1671319 start.go:247] waiting for cluster config update ...
	I1124 08:52:39.590402 1671319 start.go:256] writing updated cluster config ...
	I1124 08:52:39.590730 1671319 ssh_runner.go:195] Run: rm -f paused
	I1124 08:52:39.653332 1671319 start.go:625] kubectl: 1.33.2, cluster: 1.34.2 (minor skew: 1)
	I1124 08:52:39.656627 1671319 out.go:179] * Done! kubectl is now configured to use "dockerenv-050504" cluster and "default" namespace by default
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                      ATTEMPT             POD ID              POD                                        NAMESPACE
	37f2bf153bc85       b1a8c6f707935       10 seconds ago      Running             kindnet-cni               0                   2fabd68b62dc9       kindnet-h8wgz                              kube-system
	1defa7ee79951       94bff1bec29fd       11 seconds ago      Running             kube-proxy                0                   566bc7ccd3471       kube-proxy-95242                           kube-system
	0527dd42d4828       b178af3d91f80       25 seconds ago      Running             kube-apiserver            0                   1c32857cf55a0       kube-apiserver-dockerenv-050504            kube-system
	8f2dcdd7d9590       4f982e73e768a       25 seconds ago      Running             kube-scheduler            0                   a9cc600fd1f5b       kube-scheduler-dockerenv-050504            kube-system
	3377b3c82aa66       1b34917560f09       25 seconds ago      Running             kube-controller-manager   0                   4f6bab1924e12       kube-controller-manager-dockerenv-050504   kube-system
	099e8f2824d37       2c5f0dedd21c2       25 seconds ago      Running             etcd                      0                   90fd72fa16378       etcd-dockerenv-050504                      kube-system
	
	
	==> containerd <==
	Nov 24 08:52:41 dockerenv-050504 containerd[755]: time="2025-11-24T08:52:41.680969404Z" level=info msg="No cni config template is specified, wait for other system components to drop the config."
	Nov 24 08:52:43 dockerenv-050504 containerd[755]: time="2025-11-24T08:52:43.076065134Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-95242,Uid:7a6d2804-1ce8-43af-9526-858d48fa05b8,Namespace:kube-system,Attempt:0,}"
	Nov 24 08:52:43 dockerenv-050504 containerd[755]: time="2025-11-24T08:52:43.088750539Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kindnet-h8wgz,Uid:f09bec34-21a5-4fef-a467-6d9928e339f8,Namespace:kube-system,Attempt:0,}"
	Nov 24 08:52:43 dockerenv-050504 containerd[755]: time="2025-11-24T08:52:43.112698481Z" level=info msg="connecting to shim 566bc7ccd34713223c61f5c686eb7d1804437add9f4f3d42bf454823b3d5cb4d" address="unix:///run/containerd/s/7ebac0030940bd55c7d9745c77edcfeac9d4ab50f53ebf3025113994f52dd900" namespace=k8s.io protocol=ttrpc version=3
	Nov 24 08:52:43 dockerenv-050504 containerd[755]: time="2025-11-24T08:52:43.127841953Z" level=info msg="connecting to shim 2fabd68b62dc98e2ee740c711171b661e5ed83b3c65dd8c6491e6390d0b07be3" address="unix:///run/containerd/s/77c10ad4c12f2a14c3f6ecbace0f5159ce6fe86cb40711d91b1146f4b368f55f" namespace=k8s.io protocol=ttrpc version=3
	Nov 24 08:52:43 dockerenv-050504 containerd[755]: time="2025-11-24T08:52:43.166699395Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-95242,Uid:7a6d2804-1ce8-43af-9526-858d48fa05b8,Namespace:kube-system,Attempt:0,} returns sandbox id \"566bc7ccd34713223c61f5c686eb7d1804437add9f4f3d42bf454823b3d5cb4d\""
	Nov 24 08:52:43 dockerenv-050504 containerd[755]: time="2025-11-24T08:52:43.179926149Z" level=info msg="CreateContainer within sandbox \"566bc7ccd34713223c61f5c686eb7d1804437add9f4f3d42bf454823b3d5cb4d\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}"
	Nov 24 08:52:43 dockerenv-050504 containerd[755]: time="2025-11-24T08:52:43.195896331Z" level=info msg="Container 1defa7ee799510ec3bcb65810a3bbdb06d531803d3972512eddac2dc3ae78f7a: CDI devices from CRI Config.CDIDevices: []"
	Nov 24 08:52:43 dockerenv-050504 containerd[755]: time="2025-11-24T08:52:43.207732208Z" level=info msg="CreateContainer within sandbox \"566bc7ccd34713223c61f5c686eb7d1804437add9f4f3d42bf454823b3d5cb4d\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"1defa7ee799510ec3bcb65810a3bbdb06d531803d3972512eddac2dc3ae78f7a\""
	Nov 24 08:52:43 dockerenv-050504 containerd[755]: time="2025-11-24T08:52:43.210822463Z" level=info msg="StartContainer for \"1defa7ee799510ec3bcb65810a3bbdb06d531803d3972512eddac2dc3ae78f7a\""
	Nov 24 08:52:43 dockerenv-050504 containerd[755]: time="2025-11-24T08:52:43.215643152Z" level=info msg="connecting to shim 1defa7ee799510ec3bcb65810a3bbdb06d531803d3972512eddac2dc3ae78f7a" address="unix:///run/containerd/s/7ebac0030940bd55c7d9745c77edcfeac9d4ab50f53ebf3025113994f52dd900" protocol=ttrpc version=3
	Nov 24 08:52:43 dockerenv-050504 containerd[755]: time="2025-11-24T08:52:43.249856637Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kindnet-h8wgz,Uid:f09bec34-21a5-4fef-a467-6d9928e339f8,Namespace:kube-system,Attempt:0,} returns sandbox id \"2fabd68b62dc98e2ee740c711171b661e5ed83b3c65dd8c6491e6390d0b07be3\""
	Nov 24 08:52:43 dockerenv-050504 containerd[755]: time="2025-11-24T08:52:43.260402561Z" level=info msg="CreateContainer within sandbox \"2fabd68b62dc98e2ee740c711171b661e5ed83b3c65dd8c6491e6390d0b07be3\" for container &ContainerMetadata{Name:kindnet-cni,Attempt:0,}"
	Nov 24 08:52:43 dockerenv-050504 containerd[755]: time="2025-11-24T08:52:43.272198675Z" level=info msg="Container 37f2bf153bc856610cdcdbe853580de7d497db586132381163d4e1bad699ffb4: CDI devices from CRI Config.CDIDevices: []"
	Nov 24 08:52:43 dockerenv-050504 containerd[755]: time="2025-11-24T08:52:43.283223620Z" level=info msg="CreateContainer within sandbox \"2fabd68b62dc98e2ee740c711171b661e5ed83b3c65dd8c6491e6390d0b07be3\" for &ContainerMetadata{Name:kindnet-cni,Attempt:0,} returns container id \"37f2bf153bc856610cdcdbe853580de7d497db586132381163d4e1bad699ffb4\""
	Nov 24 08:52:43 dockerenv-050504 containerd[755]: time="2025-11-24T08:52:43.287282815Z" level=info msg="StartContainer for \"37f2bf153bc856610cdcdbe853580de7d497db586132381163d4e1bad699ffb4\""
	Nov 24 08:52:43 dockerenv-050504 containerd[755]: time="2025-11-24T08:52:43.288330648Z" level=info msg="connecting to shim 37f2bf153bc856610cdcdbe853580de7d497db586132381163d4e1bad699ffb4" address="unix:///run/containerd/s/77c10ad4c12f2a14c3f6ecbace0f5159ce6fe86cb40711d91b1146f4b368f55f" protocol=ttrpc version=3
	Nov 24 08:52:43 dockerenv-050504 containerd[755]: time="2025-11-24T08:52:43.354809990Z" level=info msg="StartContainer for \"1defa7ee799510ec3bcb65810a3bbdb06d531803d3972512eddac2dc3ae78f7a\" returns successfully"
	Nov 24 08:52:43 dockerenv-050504 containerd[755]: time="2025-11-24T08:52:43.445886317Z" level=info msg="StartContainer for \"37f2bf153bc856610cdcdbe853580de7d497db586132381163d4e1bad699ffb4\" returns successfully"
	Nov 24 08:52:53 dockerenv-050504 containerd[755]: time="2025-11-24T08:52:53.827055993Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE         \"/etc/cni/net.d/10-kindnet.conflist.temp\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
	Nov 24 08:52:53 dockerenv-050504 containerd[755]: time="2025-11-24T08:52:53.827142689Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE         \"/etc/cni/net.d/10-kindnet.conflist.temp\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
	Nov 24 08:52:53 dockerenv-050504 containerd[755]: time="2025-11-24T08:52:53.827206690Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE         \"/etc/cni/net.d/10-kindnet.conflist.temp\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
	Nov 24 08:52:54 dockerenv-050504 containerd[755]: time="2025-11-24T08:52:54.233625352Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:storage-provisioner,Uid:c515ef78-ba45-42f2-a060-8e1ccd3535e3,Namespace:kube-system,Attempt:0,}"
	Nov 24 08:52:54 dockerenv-050504 containerd[755]: time="2025-11-24T08:52:54.237340403Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-xv4r9,Uid:3cfe63c1-7428-41b1-bd6f-7cba0e4b103b,Namespace:kube-system,Attempt:0,}"
	Nov 24 08:52:54 dockerenv-050504 containerd[755]: time="2025-11-24T08:52:54.283780216Z" level=info msg="connecting to shim 7c09a8156e3fb5768652728e3eabb8b4096e7c690f9185a2d815b4ba3f58b409" address="unix:///run/containerd/s/29c936bf8c036e62100e0bffce12516ec71ceb300dc446723a5dad294f20cc83" namespace=k8s.io protocol=ttrpc version=3
	
	
	==> describe nodes <==
	Name:               dockerenv-050504
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=arm64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=arm64
	                    kubernetes.io/hostname=dockerenv-050504
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=393ee3e0b845623107dce6cda4f48ffd5c3d1811
	                    minikube.k8s.io/name=dockerenv-050504
	                    minikube.k8s.io/primary=true
	                    minikube.k8s.io/updated_at=2025_11_24T08_52_38_0700
	                    minikube.k8s.io/version=v1.37.0
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Mon, 24 Nov 2025 08:52:34 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  dockerenv-050504
	  AcquireTime:     <unset>
	  RenewTime:       Mon, 24 Nov 2025 08:52:47 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Mon, 24 Nov 2025 08:52:53 +0000   Mon, 24 Nov 2025 08:52:29 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Mon, 24 Nov 2025 08:52:53 +0000   Mon, 24 Nov 2025 08:52:29 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Mon, 24 Nov 2025 08:52:53 +0000   Mon, 24 Nov 2025 08:52:29 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Mon, 24 Nov 2025 08:52:53 +0000   Mon, 24 Nov 2025 08:52:53 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.168.49.2
	  Hostname:    dockerenv-050504
	Capacity:
	  cpu:                2
	  ephemeral-storage:  203034800Ki
	  hugepages-1Gi:      0
	  hugepages-2Mi:      0
	  hugepages-32Mi:     0
	  hugepages-64Ki:     0
	  memory:             8022296Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  203034800Ki
	  hugepages-1Gi:      0
	  hugepages-2Mi:      0
	  hugepages-32Mi:     0
	  hugepages-64Ki:     0
	  memory:             8022296Ki
	  pods:               110
	System Info:
	  Machine ID:                 7283ea1857f18f20a875c29069214c9d
	  System UUID:                3b5e69e4-86da-4fcf-ba74-91d53b895ed2
	  Boot ID:                    e6ca431c-3a35-478f-87f6-f49cc4bc8a65
	  Kernel Version:             5.15.0-1084-aws
	  OS Image:                   Debian GNU/Linux 12 (bookworm)
	  Operating System:           linux
	  Architecture:               arm64
	  Container Runtime Version:  containerd://2.1.5
	  Kubelet Version:            v1.34.2
	  Kube-Proxy Version:         
	PodCIDR:                      10.244.0.0/24
	PodCIDRs:                     10.244.0.0/24
	Non-terminated Pods:          (8 in total)
	  Namespace                   Name                                        CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                        ------------  ----------  ---------------  -------------  ---
	  kube-system                 coredns-66bc5c9577-xv4r9                    100m (5%)     0 (0%)      70Mi (0%)        170Mi (2%)     12s
	  kube-system                 etcd-dockerenv-050504                       100m (5%)     0 (0%)      100Mi (1%)       0 (0%)         17s
	  kube-system                 kindnet-h8wgz                               100m (5%)     100m (5%)   50Mi (0%)        50Mi (0%)      12s
	  kube-system                 kube-apiserver-dockerenv-050504             250m (12%)    0 (0%)      0 (0%)           0 (0%)         18s
	  kube-system                 kube-controller-manager-dockerenv-050504    200m (10%)    0 (0%)      0 (0%)           0 (0%)         17s
	  kube-system                 kube-proxy-95242                            0 (0%)        0 (0%)      0 (0%)           0 (0%)         12s
	  kube-system                 kube-scheduler-dockerenv-050504             100m (5%)     0 (0%)      0 (0%)           0 (0%)         17s
	  kube-system                 storage-provisioner                         0 (0%)        0 (0%)      0 (0%)           0 (0%)         15s
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests    Limits
	  --------           --------    ------
	  cpu                850m (42%)  100m (5%)
	  memory             220Mi (2%)  220Mi (2%)
	  ephemeral-storage  0 (0%)      0 (0%)
	  hugepages-1Gi      0 (0%)      0 (0%)
	  hugepages-2Mi      0 (0%)      0 (0%)
	  hugepages-32Mi     0 (0%)      0 (0%)
	  hugepages-64Ki     0 (0%)      0 (0%)
	Events:
	  Type     Reason                   Age                From             Message
	  ----     ------                   ----               ----             -------
	  Normal   Starting                 10s                kube-proxy       
	  Normal   NodeAllocatableEnforced  26s                kubelet          Updated Node Allocatable limit across pods
	  Warning  CgroupV1                 26s                kubelet          cgroup v1 support is in maintenance mode, please migrate to cgroup v2
	  Normal   NodeHasSufficientMemory  26s (x8 over 26s)  kubelet          Node dockerenv-050504 status is now: NodeHasSufficientMemory
	  Normal   NodeHasNoDiskPressure    26s (x8 over 26s)  kubelet          Node dockerenv-050504 status is now: NodeHasNoDiskPressure
	  Normal   NodeHasSufficientPID     26s (x7 over 26s)  kubelet          Node dockerenv-050504 status is now: NodeHasSufficientPID
	  Normal   Starting                 26s                kubelet          Starting kubelet.
	  Warning  CgroupV1                 17s                kubelet          cgroup v1 support is in maintenance mode, please migrate to cgroup v2
	  Normal   Starting                 17s                kubelet          Starting kubelet.
	  Normal   NodeAllocatableEnforced  17s                kubelet          Updated Node Allocatable limit across pods
	  Normal   NodeHasSufficientMemory  17s                kubelet          Node dockerenv-050504 status is now: NodeHasSufficientMemory
	  Normal   NodeHasNoDiskPressure    17s                kubelet          Node dockerenv-050504 status is now: NodeHasNoDiskPressure
	  Normal   NodeHasSufficientPID     17s                kubelet          Node dockerenv-050504 status is now: NodeHasSufficientPID
	  Normal   RegisteredNode           13s                node-controller  Node dockerenv-050504 event: Registered Node dockerenv-050504 in Controller
	  Normal   NodeReady                1s                 kubelet          Node dockerenv-050504 status is now: NodeReady
	
	
	==> dmesg <==
	[Nov24 08:20] overlayfs: idmapped layers are currently not supported
	[Nov24 08:21] overlayfs: idmapped layers are currently not supported
	[Nov24 08:22] overlayfs: idmapped layers are currently not supported
	[Nov24 08:23] overlayfs: idmapped layers are currently not supported
	[Nov24 08:24] overlayfs: idmapped layers are currently not supported
	[ +19.566509] overlayfs: idmapped layers are currently not supported
	[Nov24 08:25] overlayfs: idmapped layers are currently not supported
	[ +35.934095] overlayfs: idmapped layers are currently not supported
	[Nov24 08:27] overlayfs: idmapped layers are currently not supported
	[Nov24 08:28] overlayfs: idmapped layers are currently not supported
	[Nov24 08:29] overlayfs: idmapped layers are currently not supported
	[Nov24 08:30] overlayfs: idmapped layers are currently not supported
	[Nov24 08:31] overlayfs: idmapped layers are currently not supported
	[ +44.505180] overlayfs: idmapped layers are currently not supported
	[Nov24 08:32] overlayfs: idmapped layers are currently not supported
	[Nov24 08:33] overlayfs: idmapped layers are currently not supported
	[Nov24 08:34] overlayfs: idmapped layers are currently not supported
	[ +21.137877] overlayfs: idmapped layers are currently not supported
	[ +28.724175] overlayfs: idmapped layers are currently not supported
	[Nov24 08:36] overlayfs: idmapped layers are currently not supported
	[Nov24 08:37] overlayfs: idmapped layers are currently not supported
	[Nov24 08:38] overlayfs: idmapped layers are currently not supported
	[  +4.063834] overlayfs: idmapped layers are currently not supported
	[Nov24 08:40] overlayfs: idmapped layers are currently not supported
	[Nov24 08:42] kauditd_printk_skb: 8 callbacks suppressed
	
	
	==> etcd [099e8f2824d371f2278d9770f0a5cda3c248a7e0b820877d0ea9ab238d0fcb7e] <==
	{"level":"warn","ts":"2025-11-24T08:52:32.348904Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:44198","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:52:32.394203Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:44210","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:52:32.428401Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:44242","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:52:32.451598Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:44236","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:52:32.460474Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:44252","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:52:32.481438Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:44276","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:52:32.518756Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:44296","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:52:32.540620Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:44312","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:52:32.584771Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:44328","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:52:32.629240Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:44348","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:52:32.656651Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:44362","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:52:32.698829Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:44388","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:52:32.717434Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:44408","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:52:32.753622Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:44430","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:52:32.774643Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:44460","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:52:32.806018Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:44486","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:52:32.844044Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:44492","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:52:32.861567Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:44514","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:52:32.894772Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:44542","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:52:32.930314Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:44564","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:52:32.955808Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:44582","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:52:32.978627Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:44600","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:52:33.006755Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:44614","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:52:33.038743Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:44640","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:52:33.199158Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:44652","server-name":"","error":"EOF"}
	
	
	==> kernel <==
	 08:52:54 up  7:35,  0 user,  load average: 1.21, 1.18, 2.02
	Linux dockerenv-050504 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kindnet [37f2bf153bc856610cdcdbe853580de7d497db586132381163d4e1bad699ffb4] <==
	I1124 08:52:43.623119       1 main.go:109] connected to apiserver: https://10.96.0.1:443
	I1124 08:52:43.623746       1 main.go:139] hostIP = 192.168.49.2
	podIP = 192.168.49.2
	I1124 08:52:43.623998       1 main.go:148] setting mtu 1500 for CNI 
	I1124 08:52:43.624071       1 main.go:178] kindnetd IP family: "ipv4"
	I1124 08:52:43.624138       1 main.go:182] noMask IPv4 subnets: [10.244.0.0/16]
	time="2025-11-24T08:52:43Z" level=info msg="Created plugin 10-kube-network-policies (kindnetd, handles RunPodSandbox,RemovePodSandbox)"
	I1124 08:52:43.823766       1 controller.go:377] "Starting controller" name="kube-network-policies"
	I1124 08:52:43.918521       1 controller.go:381] "Waiting for informer caches to sync"
	I1124 08:52:43.918620       1 shared_informer.go:350] "Waiting for caches to sync" controller="kube-network-policies"
	I1124 08:52:43.919625       1 controller.go:390] nri plugin exited: failed to connect to NRI service: dial unix /var/run/nri/nri.sock: connect: no such file or directory
	I1124 08:52:44.118822       1 shared_informer.go:357] "Caches are synced" controller="kube-network-policies"
	I1124 08:52:44.118853       1 metrics.go:72] Registering metrics
	I1124 08:52:44.119120       1 controller.go:711] "Syncing nftables rules"
	I1124 08:52:53.826545       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1124 08:52:53.826603       1 main.go:301] handling current node
	
	
	==> kube-apiserver [0527dd42d4828e42b6493849f1a29bf4f2e91a6e459a3640b6e4f58400228ca6] <==
	I1124 08:52:34.738302       1 cache.go:32] Waiting for caches to sync for autoregister controller
	I1124 08:52:34.738333       1 cache.go:39] Caches are synced for autoregister controller
	I1124 08:52:34.746267       1 controller.go:667] quota admission added evaluator for: namespaces
	I1124 08:52:34.761459       1 cidrallocator.go:301] created ClusterIP allocator for Service CIDR 10.96.0.0/12
	I1124 08:52:34.761729       1 default_servicecidr_controller.go:228] Setting default ServiceCIDR condition Ready to True
	I1124 08:52:34.786302       1 cidrallocator.go:277] updated ClusterIP allocator for Service CIDR 10.96.0.0/12
	I1124 08:52:34.792849       1 default_servicecidr_controller.go:137] Shutting down kubernetes-service-cidr-controller
	I1124 08:52:34.802708       1 controller.go:667] quota admission added evaluator for: leases.coordination.k8s.io
	I1124 08:52:35.427842       1 storage_scheduling.go:95] created PriorityClass system-node-critical with value 2000001000
	I1124 08:52:35.436373       1 storage_scheduling.go:95] created PriorityClass system-cluster-critical with value 2000000000
	I1124 08:52:35.436569       1 storage_scheduling.go:111] all system priority classes are created successfully or already exist.
	I1124 08:52:36.193924       1 controller.go:667] quota admission added evaluator for: roles.rbac.authorization.k8s.io
	I1124 08:52:36.251960       1 controller.go:667] quota admission added evaluator for: rolebindings.rbac.authorization.k8s.io
	I1124 08:52:36.364072       1 alloc.go:328] "allocated clusterIPs" service="default/kubernetes" clusterIPs={"IPv4":"10.96.0.1"}
	W1124 08:52:36.373730       1 lease.go:265] Resetting endpoints for master service "kubernetes" to [192.168.49.2]
	I1124 08:52:36.375161       1 controller.go:667] quota admission added evaluator for: endpoints
	I1124 08:52:36.381386       1 controller.go:667] quota admission added evaluator for: endpointslices.discovery.k8s.io
	I1124 08:52:36.637199       1 controller.go:667] quota admission added evaluator for: serviceaccounts
	I1124 08:52:37.406034       1 controller.go:667] quota admission added evaluator for: deployments.apps
	I1124 08:52:37.421181       1 alloc.go:328] "allocated clusterIPs" service="kube-system/kube-dns" clusterIPs={"IPv4":"10.96.0.10"}
	I1124 08:52:37.433389       1 controller.go:667] quota admission added evaluator for: daemonsets.apps
	I1124 08:52:42.288156       1 cidrallocator.go:277] updated ClusterIP allocator for Service CIDR 10.96.0.0/12
	I1124 08:52:42.297531       1 cidrallocator.go:277] updated ClusterIP allocator for Service CIDR 10.96.0.0/12
	I1124 08:52:42.332080       1 controller.go:667] quota admission added evaluator for: replicasets.apps
	I1124 08:52:42.731677       1 controller.go:667] quota admission added evaluator for: controllerrevisions.apps
	
	
	==> kube-controller-manager [3377b3c82aa66e8709fed91abfc270439e0f52062ed12fb744c10af71bb439da] <==
	I1124 08:52:41.639020       1 shared_informer.go:356] "Caches are synced" controller="disruption"
	I1124 08:52:41.639273       1 shared_informer.go:356] "Caches are synced" controller="resource quota"
	I1124 08:52:41.644549       1 shared_informer.go:356] "Caches are synced" controller="ephemeral"
	I1124 08:52:41.648073       1 shared_informer.go:356] "Caches are synced" controller="certificate-csrsigning-kubelet-serving"
	I1124 08:52:41.649295       1 shared_informer.go:356] "Caches are synced" controller="certificate-csrsigning-kubelet-client"
	I1124 08:52:41.650036       1 shared_informer.go:356] "Caches are synced" controller="certificate-csrsigning-kube-apiserver-client"
	I1124 08:52:41.651179       1 shared_informer.go:356] "Caches are synced" controller="certificate-csrsigning-legacy-unknown"
	I1124 08:52:41.654534       1 shared_informer.go:356] "Caches are synced" controller="certificate-csrapproving"
	I1124 08:52:41.654550       1 shared_informer.go:356] "Caches are synced" controller="garbage collector"
	I1124 08:52:41.655683       1 shared_informer.go:356] "Caches are synced" controller="resource quota"
	I1124 08:52:41.660868       1 shared_informer.go:356] "Caches are synced" controller="endpoint_slice"
	I1124 08:52:41.669304       1 shared_informer.go:356] "Caches are synced" controller="TTL"
	I1124 08:52:41.673808       1 shared_informer.go:356] "Caches are synced" controller="HPA"
	I1124 08:52:41.676279       1 shared_informer.go:356] "Caches are synced" controller="taint-eviction-controller"
	I1124 08:52:41.676288       1 shared_informer.go:356] "Caches are synced" controller="expand"
	I1124 08:52:41.676518       1 shared_informer.go:356] "Caches are synced" controller="persistent volume"
	I1124 08:52:41.676612       1 shared_informer.go:356] "Caches are synced" controller="endpoint"
	I1124 08:52:41.676710       1 shared_informer.go:356] "Caches are synced" controller="GC"
	I1124 08:52:41.676791       1 shared_informer.go:356] "Caches are synced" controller="daemon sets"
	I1124 08:52:41.676316       1 shared_informer.go:356] "Caches are synced" controller="attach detach"
	I1124 08:52:41.676302       1 shared_informer.go:356] "Caches are synced" controller="crt configmap"
	I1124 08:52:41.676933       1 shared_informer.go:356] "Caches are synced" controller="bootstrap_signer"
	I1124 08:52:41.683987       1 shared_informer.go:356] "Caches are synced" controller="garbage collector"
	I1124 08:52:41.684011       1 garbagecollector.go:154] "Garbage collector: all resource monitors have synced" logger="garbage-collector-controller"
	I1124 08:52:41.684019       1 garbagecollector.go:157] "Proceeding to collect garbage" logger="garbage-collector-controller"
	
	
	==> kube-proxy [1defa7ee799510ec3bcb65810a3bbdb06d531803d3972512eddac2dc3ae78f7a] <==
	I1124 08:52:43.396424       1 server_linux.go:53] "Using iptables proxy"
	I1124 08:52:43.485934       1 shared_informer.go:349] "Waiting for caches to sync" controller="node informer cache"
	I1124 08:52:43.586837       1 shared_informer.go:356] "Caches are synced" controller="node informer cache"
	I1124 08:52:43.586896       1 server.go:219] "Successfully retrieved NodeIPs" NodeIPs=["192.168.49.2"]
	E1124 08:52:43.586994       1 server.go:256] "Kube-proxy configuration may be incomplete or incorrect" err="nodePortAddresses is unset; NodePort connections will be accepted on all local IPs. Consider using `--nodeport-addresses primary`"
	I1124 08:52:43.606989       1 server.go:265] "kube-proxy running in dual-stack mode" primary ipFamily="IPv4"
	I1124 08:52:43.607222       1 server_linux.go:132] "Using iptables Proxier"
	I1124 08:52:43.611277       1 proxier.go:242] "Setting route_localnet=1 to allow node-ports on localhost; to change this either disable iptables.localhostNodePorts (--iptables-localhost-nodeports) or set nodePortAddresses (--nodeport-addresses) to filter loopback addresses" ipFamily="IPv4"
	I1124 08:52:43.611755       1 server.go:527] "Version info" version="v1.34.2"
	I1124 08:52:43.612029       1 server.go:529] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I1124 08:52:43.613890       1 config.go:200] "Starting service config controller"
	I1124 08:52:43.614064       1 shared_informer.go:349] "Waiting for caches to sync" controller="service config"
	I1124 08:52:43.614190       1 config.go:106] "Starting endpoint slice config controller"
	I1124 08:52:43.614262       1 shared_informer.go:349] "Waiting for caches to sync" controller="endpoint slice config"
	I1124 08:52:43.614357       1 config.go:403] "Starting serviceCIDR config controller"
	I1124 08:52:43.614420       1 shared_informer.go:349] "Waiting for caches to sync" controller="serviceCIDR config"
	I1124 08:52:43.618403       1 config.go:309] "Starting node config controller"
	I1124 08:52:43.618603       1 shared_informer.go:349] "Waiting for caches to sync" controller="node config"
	I1124 08:52:43.618673       1 shared_informer.go:356] "Caches are synced" controller="node config"
	I1124 08:52:43.714554       1 shared_informer.go:356] "Caches are synced" controller="endpoint slice config"
	I1124 08:52:43.714782       1 shared_informer.go:356] "Caches are synced" controller="service config"
	I1124 08:52:43.714828       1 shared_informer.go:356] "Caches are synced" controller="serviceCIDR config"
	
	
	==> kube-scheduler [8f2dcdd7d9590c10656d41a2a9890ad8e41347011b97c0d293500678e32580d5] <==
	I1124 08:52:35.313177       1 server.go:177] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I1124 08:52:35.316213       1 configmap_cafile_content.go:205] "Starting controller" name="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
	I1124 08:52:35.316265       1 shared_informer.go:349] "Waiting for caches to sync" controller="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
	I1124 08:52:35.317491       1 secure_serving.go:211] Serving securely on 127.0.0.1:10259
	I1124 08:52:35.317683       1 tlsconfig.go:243] "Starting DynamicServingCertificateController"
	E1124 08:52:35.318447       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"extension-apiserver-authentication\" is forbidden: User \"system:kube-scheduler\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"kube-system\"" logger="UnhandledError" reflector="runtime/asm_arm64.s:1223" type="*v1.ConfigMap"
	E1124 08:52:35.330864       1 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
	E1124 08:52:35.331228       1 reflector.go:205] "Failed to watch" err="failed to list *v1.PodDisruptionBudget: poddisruptionbudgets.policy is forbidden: User \"system:kube-scheduler\" cannot list resource \"poddisruptionbudgets\" in API group \"policy\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.PodDisruptionBudget"
	E1124 08:52:35.331326       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ReplicationController: replicationcontrollers is forbidden: User \"system:kube-scheduler\" cannot list resource \"replicationcontrollers\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ReplicationController"
	E1124 08:52:35.331433       1 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIStorageCapacity: csistoragecapacities.storage.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"csistoragecapacities\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIStorageCapacity"
	E1124 08:52:35.331547       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: nodes is forbidden: User \"system:kube-scheduler\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	E1124 08:52:35.331639       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ResourceClaim: resourceclaims.resource.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"resourceclaims\" in API group \"resource.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ResourceClaim"
	E1124 08:52:35.331718       1 reflector.go:205] "Failed to watch" err="failed to list *v1.DeviceClass: deviceclasses.resource.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"deviceclasses\" in API group \"resource.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.DeviceClass"
	E1124 08:52:35.332666       1 reflector.go:205] "Failed to watch" err="failed to list *v1.VolumeAttachment: volumeattachments.storage.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"volumeattachments\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.VolumeAttachment"
	E1124 08:52:35.332789       1 reflector.go:205] "Failed to watch" err="failed to list *v1.PersistentVolume: persistentvolumes is forbidden: User \"system:kube-scheduler\" cannot list resource \"persistentvolumes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.PersistentVolume"
	E1124 08:52:35.332242       1 reflector.go:205] "Failed to watch" err="failed to list *v1.StatefulSet: statefulsets.apps is forbidden: User \"system:kube-scheduler\" cannot list resource \"statefulsets\" in API group \"apps\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.StatefulSet"
	E1124 08:52:35.332276       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ResourceSlice: resourceslices.resource.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"resourceslices\" in API group \"resource.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ResourceSlice"
	E1124 08:52:35.332937       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:kube-scheduler\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
	E1124 08:52:35.333031       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User \"system:kube-scheduler\" cannot list resource \"replicasets\" in API group \"apps\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ReplicaSet"
	E1124 08:52:35.333104       1 reflector.go:205] "Failed to watch" err="failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User \"system:kube-scheduler\" cannot list resource \"persistentvolumeclaims\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.PersistentVolumeClaim"
	E1124 08:52:35.333175       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Pod: pods is forbidden: User \"system:kube-scheduler\" cannot list resource \"pods\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Pod"
	E1124 08:52:35.333252       1 reflector.go:205] "Failed to watch" err="failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"csinodes\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSINode"
	E1124 08:52:35.332147       1 reflector.go:205] "Failed to watch" err="failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"storageclasses\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.StorageClass"
	E1124 08:52:35.332205       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Namespace: namespaces is forbidden: User \"system:kube-scheduler\" cannot list resource \"namespaces\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Namespace"
	I1124 08:52:36.317355       1 shared_informer.go:356] "Caches are synced" controller="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
	
	
	==> kubelet <==
	Nov 24 08:52:38 dockerenv-050504 kubelet[1435]: I1124 08:52:38.373255    1435 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
	Nov 24 08:52:38 dockerenv-050504 kubelet[1435]: I1124 08:52:38.468062    1435 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/etcd-dockerenv-050504"
	Nov 24 08:52:38 dockerenv-050504 kubelet[1435]: E1124 08:52:38.483188    1435 kubelet.go:3222] "Failed creating a mirror pod" err="pods \"etcd-dockerenv-050504\" already exists" pod="kube-system/etcd-dockerenv-050504"
	Nov 24 08:52:38 dockerenv-050504 kubelet[1435]: I1124 08:52:38.496691    1435 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-dockerenv-050504" podStartSLOduration=1.496670647 podStartE2EDuration="1.496670647s" podCreationTimestamp="2025-11-24 08:52:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 08:52:38.464490189 +0000 UTC m=+1.208759482" watchObservedRunningTime="2025-11-24 08:52:38.496670647 +0000 UTC m=+1.240939924"
	Nov 24 08:52:38 dockerenv-050504 kubelet[1435]: I1124 08:52:38.496870    1435 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-dockerenv-050504" podStartSLOduration=1.496864078 podStartE2EDuration="1.496864078s" podCreationTimestamp="2025-11-24 08:52:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 08:52:38.491727684 +0000 UTC m=+1.235996960" watchObservedRunningTime="2025-11-24 08:52:38.496864078 +0000 UTC m=+1.241133371"
	Nov 24 08:52:38 dockerenv-050504 kubelet[1435]: I1124 08:52:38.557224    1435 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/etcd-dockerenv-050504" podStartSLOduration=1.5571994660000001 podStartE2EDuration="1.557199466s" podCreationTimestamp="2025-11-24 08:52:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 08:52:38.513332539 +0000 UTC m=+1.257601824" watchObservedRunningTime="2025-11-24 08:52:38.557199466 +0000 UTC m=+1.301468743"
	Nov 24 08:52:38 dockerenv-050504 kubelet[1435]: I1124 08:52:38.623085    1435 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-dockerenv-050504" podStartSLOduration=2.623047958 podStartE2EDuration="2.623047958s" podCreationTimestamp="2025-11-24 08:52:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 08:52:38.560825022 +0000 UTC m=+1.305094307" watchObservedRunningTime="2025-11-24 08:52:38.623047958 +0000 UTC m=+1.367317252"
	Nov 24 08:52:41 dockerenv-050504 kubelet[1435]: I1124 08:52:41.680118    1435 kuberuntime_manager.go:1828] "Updating runtime config through cri with podcidr" CIDR="10.244.0.0/24"
	Nov 24 08:52:41 dockerenv-050504 kubelet[1435]: I1124 08:52:41.681471    1435 kubelet_network.go:47] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="10.244.0.0/24"
	Nov 24 08:52:42 dockerenv-050504 kubelet[1435]: I1124 08:52:42.936385    1435 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/f09bec34-21a5-4fef-a467-6d9928e339f8-xtables-lock\") pod \"kindnet-h8wgz\" (UID: \"f09bec34-21a5-4fef-a467-6d9928e339f8\") " pod="kube-system/kindnet-h8wgz"
	Nov 24 08:52:42 dockerenv-050504 kubelet[1435]: I1124 08:52:42.936443    1435 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9n75q\" (UniqueName: \"kubernetes.io/projected/f09bec34-21a5-4fef-a467-6d9928e339f8-kube-api-access-9n75q\") pod \"kindnet-h8wgz\" (UID: \"f09bec34-21a5-4fef-a467-6d9928e339f8\") " pod="kube-system/kindnet-h8wgz"
	Nov 24 08:52:42 dockerenv-050504 kubelet[1435]: I1124 08:52:42.936470    1435 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-cfg\" (UniqueName: \"kubernetes.io/host-path/f09bec34-21a5-4fef-a467-6d9928e339f8-cni-cfg\") pod \"kindnet-h8wgz\" (UID: \"f09bec34-21a5-4fef-a467-6d9928e339f8\") " pod="kube-system/kindnet-h8wgz"
	Nov 24 08:52:42 dockerenv-050504 kubelet[1435]: I1124 08:52:42.936491    1435 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fllkc\" (UniqueName: \"kubernetes.io/projected/7a6d2804-1ce8-43af-9526-858d48fa05b8-kube-api-access-fllkc\") pod \"kube-proxy-95242\" (UID: \"7a6d2804-1ce8-43af-9526-858d48fa05b8\") " pod="kube-system/kube-proxy-95242"
	Nov 24 08:52:42 dockerenv-050504 kubelet[1435]: I1124 08:52:42.936509    1435 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f09bec34-21a5-4fef-a467-6d9928e339f8-lib-modules\") pod \"kindnet-h8wgz\" (UID: \"f09bec34-21a5-4fef-a467-6d9928e339f8\") " pod="kube-system/kindnet-h8wgz"
	Nov 24 08:52:42 dockerenv-050504 kubelet[1435]: I1124 08:52:42.936525    1435 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7a6d2804-1ce8-43af-9526-858d48fa05b8-lib-modules\") pod \"kube-proxy-95242\" (UID: \"7a6d2804-1ce8-43af-9526-858d48fa05b8\") " pod="kube-system/kube-proxy-95242"
	Nov 24 08:52:42 dockerenv-050504 kubelet[1435]: I1124 08:52:42.936541    1435 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/7a6d2804-1ce8-43af-9526-858d48fa05b8-kube-proxy\") pod \"kube-proxy-95242\" (UID: \"7a6d2804-1ce8-43af-9526-858d48fa05b8\") " pod="kube-system/kube-proxy-95242"
	Nov 24 08:52:42 dockerenv-050504 kubelet[1435]: I1124 08:52:42.936557    1435 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/7a6d2804-1ce8-43af-9526-858d48fa05b8-xtables-lock\") pod \"kube-proxy-95242\" (UID: \"7a6d2804-1ce8-43af-9526-858d48fa05b8\") " pod="kube-system/kube-proxy-95242"
	Nov 24 08:52:43 dockerenv-050504 kubelet[1435]: I1124 08:52:43.048481    1435 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
	Nov 24 08:52:43 dockerenv-050504 kubelet[1435]: I1124 08:52:43.522976    1435 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kindnet-h8wgz" podStartSLOduration=1.5229420820000001 podStartE2EDuration="1.522942082s" podCreationTimestamp="2025-11-24 08:52:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 08:52:43.508328577 +0000 UTC m=+6.252597862" watchObservedRunningTime="2025-11-24 08:52:43.522942082 +0000 UTC m=+6.267211367"
	Nov 24 08:52:43 dockerenv-050504 kubelet[1435]: I1124 08:52:43.540265    1435 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-95242" podStartSLOduration=1.540246993 podStartE2EDuration="1.540246993s" podCreationTimestamp="2025-11-24 08:52:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 08:52:43.540013997 +0000 UTC m=+6.284283274" watchObservedRunningTime="2025-11-24 08:52:43.540246993 +0000 UTC m=+6.284516269"
	Nov 24 08:52:53 dockerenv-050504 kubelet[1435]: I1124 08:52:53.883213    1435 kubelet_node_status.go:439] "Fast updating node status as it just became ready"
	Nov 24 08:52:54 dockerenv-050504 kubelet[1435]: I1124 08:52:54.009886    1435 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3cfe63c1-7428-41b1-bd6f-7cba0e4b103b-config-volume\") pod \"coredns-66bc5c9577-xv4r9\" (UID: \"3cfe63c1-7428-41b1-bd6f-7cba0e4b103b\") " pod="kube-system/coredns-66bc5c9577-xv4r9"
	Nov 24 08:52:54 dockerenv-050504 kubelet[1435]: I1124 08:52:54.009960    1435 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/host-path/c515ef78-ba45-42f2-a060-8e1ccd3535e3-tmp\") pod \"storage-provisioner\" (UID: \"c515ef78-ba45-42f2-a060-8e1ccd3535e3\") " pod="kube-system/storage-provisioner"
	Nov 24 08:52:54 dockerenv-050504 kubelet[1435]: I1124 08:52:54.010000    1435 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cctb\" (UniqueName: \"kubernetes.io/projected/c515ef78-ba45-42f2-a060-8e1ccd3535e3-kube-api-access-6cctb\") pod \"storage-provisioner\" (UID: \"c515ef78-ba45-42f2-a060-8e1ccd3535e3\") " pod="kube-system/storage-provisioner"
	Nov 24 08:52:54 dockerenv-050504 kubelet[1435]: I1124 08:52:54.010027    1435 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfq5d\" (UniqueName: \"kubernetes.io/projected/3cfe63c1-7428-41b1-bd6f-7cba0e4b103b-kube-api-access-kfq5d\") pod \"coredns-66bc5c9577-xv4r9\" (UID: \"3cfe63c1-7428-41b1-bd6f-7cba0e4b103b\") " pod="kube-system/coredns-66bc5c9577-xv4r9"
	

-- /stdout --
helpers_test.go:262: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p dockerenv-050504 -n dockerenv-050504
helpers_test.go:269: (dbg) Run:  kubectl --context dockerenv-050504 get po -o=jsonpath={.items[*].metadata.name} -A --field-selector=status.phase!=Running
helpers_test.go:280: non-running pods: coredns-66bc5c9577-xv4r9 storage-provisioner
helpers_test.go:282: ======> post-mortem[TestDockerEnvContainerd]: describe non-running pods <======
helpers_test.go:285: (dbg) Run:  kubectl --context dockerenv-050504 describe pod coredns-66bc5c9577-xv4r9 storage-provisioner
helpers_test.go:285: (dbg) Non-zero exit: kubectl --context dockerenv-050504 describe pod coredns-66bc5c9577-xv4r9 storage-provisioner: exit status 1 (140.05055ms)

** stderr ** 
	Error from server (NotFound): pods "coredns-66bc5c9577-xv4r9" not found
	Error from server (NotFound): pods "storage-provisioner" not found

** /stderr **
helpers_test.go:287: kubectl --context dockerenv-050504 describe pod coredns-66bc5c9577-xv4r9 storage-provisioner: exit status 1
helpers_test.go:175: Cleaning up "dockerenv-050504" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-arm64 delete -p dockerenv-050504
helpers_test.go:178: (dbg) Done: out/minikube-linux-arm64 delete -p dockerenv-050504: (2.353550845s)
--- FAIL: TestDockerEnvContainerd (51.16s)

TestFunctional/parallel/DashboardCmd (302.68s)

=== RUN   TestFunctional/parallel/DashboardCmd
=== PAUSE TestFunctional/parallel/DashboardCmd

=== CONT  TestFunctional/parallel/DashboardCmd
functional_test.go:920: (dbg) daemon: [out/minikube-linux-arm64 dashboard --url --port 36195 -p functional-941011 --alsologtostderr -v=1]
E1124 09:06:03.604242 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/addons-674149/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
functional_test.go:933: output didn't produce a URL
functional_test.go:925: (dbg) stopping [out/minikube-linux-arm64 dashboard --url --port 36195 -p functional-941011 --alsologtostderr -v=1] ...
functional_test.go:925: (dbg) [out/minikube-linux-arm64 dashboard --url --port 36195 -p functional-941011 --alsologtostderr -v=1] stdout:
functional_test.go:925: (dbg) [out/minikube-linux-arm64 dashboard --url --port 36195 -p functional-941011 --alsologtostderr -v=1] stderr:
I1124 09:02:11.163183 1692489 out.go:360] Setting OutFile to fd 1 ...
I1124 09:02:11.164258 1692489 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1124 09:02:11.164275 1692489 out.go:374] Setting ErrFile to fd 2...
I1124 09:02:11.164281 1692489 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1124 09:02:11.164556 1692489 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21978-1652607/.minikube/bin
I1124 09:02:11.164852 1692489 mustload.go:66] Loading cluster: functional-941011
I1124 09:02:11.165362 1692489 config.go:182] Loaded profile config "functional-941011": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
I1124 09:02:11.165862 1692489 cli_runner.go:164] Run: docker container inspect functional-941011 --format={{.State.Status}}
I1124 09:02:11.187706 1692489 host.go:66] Checking if "functional-941011" exists ...
I1124 09:02:11.188020 1692489 cli_runner.go:164] Run: docker system info --format "{{json .}}"
I1124 09:02:11.245750 1692489 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-11-24 09:02:11.23558437 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aar
ch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214831104 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Pat
h:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
I1124 09:02:11.245882 1692489 api_server.go:166] Checking apiserver status ...
I1124 09:02:11.245949 1692489 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
I1124 09:02:11.245998 1692489 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-941011
I1124 09:02:11.267130 1692489 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34679 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-941011/id_rsa Username:docker}
I1124 09:02:11.376888 1692489 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/4630/cgroup
I1124 09:02:11.385517 1692489 api_server.go:182] apiserver freezer: "11:freezer:/docker/d8574c2bf48ce967fe6e255a178ebf105b1a3019c24cf41f2b79f5302c127773/kubepods/burstable/pod6888344dd2fbb0d34365af48bf12ea0a/12cb72b6be32a86f06dc22816a54fbdc70bf5efe54d3eba2e90672e835fad88f"
I1124 09:02:11.385604 1692489 ssh_runner.go:195] Run: sudo cat /sys/fs/cgroup/freezer/docker/d8574c2bf48ce967fe6e255a178ebf105b1a3019c24cf41f2b79f5302c127773/kubepods/burstable/pod6888344dd2fbb0d34365af48bf12ea0a/12cb72b6be32a86f06dc22816a54fbdc70bf5efe54d3eba2e90672e835fad88f/freezer.state
I1124 09:02:11.393099 1692489 api_server.go:204] freezer state: "THAWED"
I1124 09:02:11.393129 1692489 api_server.go:253] Checking apiserver healthz at https://192.168.49.2:8441/healthz ...
I1124 09:02:11.401399 1692489 api_server.go:279] https://192.168.49.2:8441/healthz returned 200:
ok
W1124 09:02:11.401445 1692489 out.go:285] * Enabling dashboard ...
* Enabling dashboard ...
I1124 09:02:11.401663 1692489 config.go:182] Loaded profile config "functional-941011": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
I1124 09:02:11.401683 1692489 addons.go:70] Setting dashboard=true in profile "functional-941011"
I1124 09:02:11.401694 1692489 addons.go:239] Setting addon dashboard=true in "functional-941011"
I1124 09:02:11.401718 1692489 host.go:66] Checking if "functional-941011" exists ...
I1124 09:02:11.402180 1692489 cli_runner.go:164] Run: docker container inspect functional-941011 --format={{.State.Status}}
I1124 09:02:11.422979 1692489 out.go:179]   - Using image docker.io/kubernetesui/dashboard:v2.7.0
I1124 09:02:11.425833 1692489 out.go:179]   - Using image docker.io/kubernetesui/metrics-scraper:v1.0.8
I1124 09:02:11.428701 1692489 addons.go:436] installing /etc/kubernetes/addons/dashboard-ns.yaml
I1124 09:02:11.428727 1692489 ssh_runner.go:362] scp dashboard/dashboard-ns.yaml --> /etc/kubernetes/addons/dashboard-ns.yaml (759 bytes)
I1124 09:02:11.428803 1692489 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-941011
I1124 09:02:11.445642 1692489 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34679 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-941011/id_rsa Username:docker}
I1124 09:02:11.555688 1692489 addons.go:436] installing /etc/kubernetes/addons/dashboard-clusterrole.yaml
I1124 09:02:11.555734 1692489 ssh_runner.go:362] scp dashboard/dashboard-clusterrole.yaml --> /etc/kubernetes/addons/dashboard-clusterrole.yaml (1001 bytes)
I1124 09:02:11.568838 1692489 addons.go:436] installing /etc/kubernetes/addons/dashboard-clusterrolebinding.yaml
I1124 09:02:11.568889 1692489 ssh_runner.go:362] scp dashboard/dashboard-clusterrolebinding.yaml --> /etc/kubernetes/addons/dashboard-clusterrolebinding.yaml (1018 bytes)
I1124 09:02:11.584547 1692489 addons.go:436] installing /etc/kubernetes/addons/dashboard-configmap.yaml
I1124 09:02:11.584568 1692489 ssh_runner.go:362] scp dashboard/dashboard-configmap.yaml --> /etc/kubernetes/addons/dashboard-configmap.yaml (837 bytes)
I1124 09:02:11.597859 1692489 addons.go:436] installing /etc/kubernetes/addons/dashboard-dp.yaml
I1124 09:02:11.597883 1692489 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/dashboard-dp.yaml (4288 bytes)
I1124 09:02:11.611714 1692489 addons.go:436] installing /etc/kubernetes/addons/dashboard-role.yaml
I1124 09:02:11.611737 1692489 ssh_runner.go:362] scp dashboard/dashboard-role.yaml --> /etc/kubernetes/addons/dashboard-role.yaml (1724 bytes)
I1124 09:02:11.624680 1692489 addons.go:436] installing /etc/kubernetes/addons/dashboard-rolebinding.yaml
I1124 09:02:11.624702 1692489 ssh_runner.go:362] scp dashboard/dashboard-rolebinding.yaml --> /etc/kubernetes/addons/dashboard-rolebinding.yaml (1046 bytes)
I1124 09:02:11.639927 1692489 addons.go:436] installing /etc/kubernetes/addons/dashboard-sa.yaml
I1124 09:02:11.639970 1692489 ssh_runner.go:362] scp dashboard/dashboard-sa.yaml --> /etc/kubernetes/addons/dashboard-sa.yaml (837 bytes)
I1124 09:02:11.652928 1692489 addons.go:436] installing /etc/kubernetes/addons/dashboard-secret.yaml
I1124 09:02:11.652979 1692489 ssh_runner.go:362] scp dashboard/dashboard-secret.yaml --> /etc/kubernetes/addons/dashboard-secret.yaml (1389 bytes)
I1124 09:02:11.666786 1692489 addons.go:436] installing /etc/kubernetes/addons/dashboard-svc.yaml
I1124 09:02:11.666833 1692489 ssh_runner.go:362] scp dashboard/dashboard-svc.yaml --> /etc/kubernetes/addons/dashboard-svc.yaml (1294 bytes)
I1124 09:02:11.681512 1692489 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.2/kubectl apply -f /etc/kubernetes/addons/dashboard-ns.yaml -f /etc/kubernetes/addons/dashboard-clusterrole.yaml -f /etc/kubernetes/addons/dashboard-clusterrolebinding.yaml -f /etc/kubernetes/addons/dashboard-configmap.yaml -f /etc/kubernetes/addons/dashboard-dp.yaml -f /etc/kubernetes/addons/dashboard-role.yaml -f /etc/kubernetes/addons/dashboard-rolebinding.yaml -f /etc/kubernetes/addons/dashboard-sa.yaml -f /etc/kubernetes/addons/dashboard-secret.yaml -f /etc/kubernetes/addons/dashboard-svc.yaml
I1124 09:02:12.472270 1692489 out.go:179] * Some dashboard features require the metrics-server addon. To enable all features please run:

	minikube -p functional-941011 addons enable metrics-server

I1124 09:02:12.475157 1692489 addons.go:202] Writing out "functional-941011" config to set dashboard=true...
W1124 09:02:12.475439 1692489 out.go:285] * Verifying dashboard health ...
* Verifying dashboard health ...
I1124 09:02:12.476101 1692489 kapi.go:59] client config for functional-941011: &rest.Config{Host:"https://192.168.49.2:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-941011/client.crt", KeyFile:"/home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-941011/client.key", CAFile:"/home/jenkins/minikube-integration/21978-1652607/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil
), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1fb2df0), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
I1124 09:02:12.476638 1692489 envvar.go:172] "Feature gate default state" feature="InformerResourceVersion" enabled=false
I1124 09:02:12.476658 1692489 envvar.go:172] "Feature gate default state" feature="InOrderInformers" enabled=true
I1124 09:02:12.476665 1692489 envvar.go:172] "Feature gate default state" feature="WatchListClient" enabled=false
I1124 09:02:12.476679 1692489 envvar.go:172] "Feature gate default state" feature="ClientsAllowCBOR" enabled=false
I1124 09:02:12.476685 1692489 envvar.go:172] "Feature gate default state" feature="ClientsPreferCBOR" enabled=false
I1124 09:02:12.492721 1692489 service.go:215] Found service: &Service{ObjectMeta:{kubernetes-dashboard  kubernetes-dashboard  7542375c-73ac-47d7-ac04-a96e1d729b7c 1174 0 2025-11-24 09:02:12 +0000 UTC <nil> <nil> map[addonmanager.kubernetes.io/mode:Reconcile k8s-app:kubernetes-dashboard kubernetes.io/minikube-addons:dashboard] map[kubectl.kubernetes.io/last-applied-configuration:{"apiVersion":"v1","kind":"Service","metadata":{"annotations":{},"labels":{"addonmanager.kubernetes.io/mode":"Reconcile","k8s-app":"kubernetes-dashboard","kubernetes.io/minikube-addons":"dashboard"},"name":"kubernetes-dashboard","namespace":"kubernetes-dashboard"},"spec":{"ports":[{"port":80,"targetPort":9090}],"selector":{"k8s-app":"kubernetes-dashboard"}}}
] [] [] [{kubectl-client-side-apply Update v1 2025-11-24 09:02:12 +0000 UTC FieldsV1 {"f:metadata":{"f:annotations":{".":{},"f:kubectl.kubernetes.io/last-applied-configuration":{}},"f:labels":{".":{},"f:addonmanager.kubernetes.io/mode":{},"f:k8s-app":{},"f:kubernetes.io/minikube-addons":{}}},"f:spec":{"f:internalTrafficPolicy":{},"f:ports":{".":{},"k:{\"port\":80,\"protocol\":\"TCP\"}":{".":{},"f:port":{},"f:protocol":{},"f:targetPort":{}}},"f:selector":{},"f:sessionAffinity":{},"f:type":{}}} }]},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:,Protocol:TCP,Port:80,TargetPort:{0 9090 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{k8s-app: kubernetes-dashboard,},ClusterIP:10.96.163.151,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.96.163.151],IPFamilies:[IPv4],AllocateLoadBalance
rNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}
W1124 09:02:12.492931 1692489 out.go:285] * Launching proxy ...
* Launching proxy ...
I1124 09:02:12.493047 1692489 dashboard.go:154] Executing: /usr/local/bin/kubectl [/usr/local/bin/kubectl --context functional-941011 proxy --port 36195]
I1124 09:02:12.493330 1692489 dashboard.go:159] Waiting for kubectl to output host:port ...
I1124 09:02:12.552710 1692489 dashboard.go:177] proxy stdout: Starting to serve on 127.0.0.1:36195
W1124 09:02:12.552764 1692489 out.go:285] * Verifying proxy health ...
* Verifying proxy health ...
I1124 09:02:12.572374 1692489 dashboard.go:216] http://127.0.0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: <nil> &{Status:503 Service Unavailable StatusCode:503 Proto:HTTP/1.1 ProtoMajor:1 ProtoMinor:1 Header:map[Audit-Id:[4c96c69b-c9a5-4cb2-97f7-ec090f73bbe1] Cache-Control:[no-cache, private] Content-Length:[182] Content-Type:[application/json] Date:[Mon, 24 Nov 2025 09:02:12 GMT]] Body:0x40006f3580 ContentLength:182 TransferEncoding:[] Close:false Uncompressed:false Trailer:map[] Request:0x400026a140 TLS:<nil>}
I1124 09:02:12.572457 1692489 retry.go:31] will retry after 85.291µs: Temporary Error: unexpected response code: 503
I1124 09:02:12.577794 1692489 dashboard.go:216] http://127.0.0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: <nil> &{Status:503 Service Unavailable StatusCode:503 Proto:HTTP/1.1 ProtoMajor:1 ProtoMinor:1 Header:map[Audit-Id:[70bf736c-9eec-4e2c-9112-042d5456cf88] Cache-Control:[no-cache, private] Content-Length:[182] Content-Type:[application/json] Date:[Mon, 24 Nov 2025 09:02:12 GMT]] Body:0x4000709b40 ContentLength:182 TransferEncoding:[] Close:false Uncompressed:false Trailer:map[] Request:0x400026a280 TLS:<nil>}
I1124 09:02:12.577867 1692489 retry.go:31] will retry after 171.075µs: Temporary Error: unexpected response code: 503
I1124 09:02:12.592040 1692489 dashboard.go:216] http://127.0.0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: <nil> &{Status:503 Service Unavailable StatusCode:503 Proto:HTTP/1.1 ProtoMajor:1 ProtoMinor:1 Header:map[Audit-Id:[03b0bca1-661d-4b38-b7d5-8500518ff5b3] Cache-Control:[no-cache, private] Content-Length:[182] Content-Type:[application/json] Date:[Mon, 24 Nov 2025 09:02:12 GMT]] Body:0x4000709bc0 ContentLength:182 TransferEncoding:[] Close:false Uncompressed:false Trailer:map[] Request:0x40004d8a00 TLS:<nil>}
I1124 09:02:12.592103 1692489 retry.go:31] will retry after 207.898µs: Temporary Error: unexpected response code: 503
I1124 09:02:12.596310 1692489 dashboard.go:216] http://127.0.0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: <nil> &{Status:503 Service Unavailable StatusCode:503 Proto:HTTP/1.1 ProtoMajor:1 ProtoMinor:1 Header:map[Audit-Id:[e5b45554-ea8b-433c-8707-925c88138a5b] Cache-Control:[no-cache, private] Content-Length:[182] Content-Type:[application/json] Date:[Mon, 24 Nov 2025 09:02:12 GMT]] Body:0x4000709c40 ContentLength:182 TransferEncoding:[] Close:false Uncompressed:false Trailer:map[] Request:0x40004d8b40 TLS:<nil>}
I1124 09:02:12.596368 1692489 retry.go:31] will retry after 480.866µs: Temporary Error: unexpected response code: 503
I1124 09:02:12.600753 1692489 dashboard.go:216] http://127.0.0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: <nil> &{Status:503 Service Unavailable StatusCode:503 Proto:HTTP/1.1 ProtoMajor:1 ProtoMinor:1 Header:map[Audit-Id:[5ea8414c-90b4-4625-b03e-5aa9d6ef467f] Cache-Control:[no-cache, private] Content-Length:[182] Content-Type:[application/json] Date:[Mon, 24 Nov 2025 09:02:12 GMT]] Body:0x4000709cc0 ContentLength:182 TransferEncoding:[] Close:false Uncompressed:false Trailer:map[] Request:0x40004d8dc0 TLS:<nil>}
I1124 09:02:12.600814 1692489 retry.go:31] will retry after 714.05µs: Temporary Error: unexpected response code: 503
I1124 09:02:12.604670 1692489 dashboard.go:216] http://127.0.0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: <nil> &{Status:503 Service Unavailable StatusCode:503 Proto:HTTP/1.1 ProtoMajor:1 ProtoMinor:1 Header:map[Audit-Id:[46a66c56-0257-418d-8ace-254d3b1e7040] Cache-Control:[no-cache, private] Content-Length:[182] Content-Type:[application/json] Date:[Mon, 24 Nov 2025 09:02:12 GMT]] Body:0x4000709d40 ContentLength:182 TransferEncoding:[] Close:false Uncompressed:false Trailer:map[] Request:0x40004d8f00 TLS:<nil>}
I1124 09:02:12.604742 1692489 retry.go:31] will retry after 543.013µs: Temporary Error: unexpected response code: 503
I1124 09:02:12.608559 1692489 dashboard.go:216] http://127.0.0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: <nil> &{Status:503 Service Unavailable StatusCode:503 Proto:HTTP/1.1 ProtoMajor:1 ProtoMinor:1 Header:map[Audit-Id:[95b99559-6e17-4ffb-a335-6badc13fcaf8] Cache-Control:[no-cache, private] Content-Length:[182] Content-Type:[application/json] Date:[Mon, 24 Nov 2025 09:02:12 GMT]] Body:0x4000709dc0 ContentLength:182 TransferEncoding:[] Close:false Uncompressed:false Trailer:map[] Request:0x40004d9040 TLS:<nil>}
I1124 09:02:12.608616 1692489 retry.go:31] will retry after 1.2063ms: Temporary Error: unexpected response code: 503
I1124 09:02:12.613673 1692489 dashboard.go:216] http://127.0.0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: <nil> &{Status:503 Service Unavailable StatusCode:503 Proto:HTTP/1.1 ProtoMajor:1 ProtoMinor:1 Header:map[Audit-Id:[fe1df291-661e-453e-8578-5d16bd9dfd35] Cache-Control:[no-cache, private] Content-Length:[182] Content-Type:[application/json] Date:[Mon, 24 Nov 2025 09:02:12 GMT]] Body:0x4000709e40 ContentLength:182 TransferEncoding:[] Close:false Uncompressed:false Trailer:map[] Request:0x40004d9180 TLS:<nil>}
I1124 09:02:12.613795 1692489 retry.go:31] will retry after 1.967681ms: Temporary Error: unexpected response code: 503
I1124 09:02:12.621581 1692489 dashboard.go:216] http://127.0.0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: <nil> &{Status:503 Service Unavailable StatusCode:503 Proto:HTTP/1.1 ProtoMajor:1 ProtoMinor:1 Header:map[Audit-Id:[7e3209a7-81ae-4d3e-8a0e-67f454084c6f] Cache-Control:[no-cache, private] Content-Length:[182] Content-Type:[application/json] Date:[Mon, 24 Nov 2025 09:02:12 GMT]] Body:0x40006f3a00 ContentLength:182 TransferEncoding:[] Close:false Uncompressed:false Trailer:map[] Request:0x400026a3c0 TLS:<nil>}
I1124 09:02:12.621654 1692489 retry.go:31] will retry after 3.3618ms: Temporary Error: unexpected response code: 503
I1124 09:02:12.638407 1692489 dashboard.go:216] http://127.0.0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: <nil> &{Status:503 Service Unavailable StatusCode:503 Proto:HTTP/1.1 ProtoMajor:1 ProtoMinor:1 Header:map[Audit-Id:[5658b05f-9958-4eef-b3ce-a1da9225e4d3] Cache-Control:[no-cache, private] Content-Length:[182] Content-Type:[application/json] Date:[Mon, 24 Nov 2025 09:02:12 GMT]] Body:0x4000709f40 ContentLength:182 TransferEncoding:[] Close:false Uncompressed:false Trailer:map[] Request:0x40004d92c0 TLS:<nil>}
I1124 09:02:12.638506 1692489 retry.go:31] will retry after 4.301298ms: Temporary Error: unexpected response code: 503
I1124 09:02:12.646625 1692489 dashboard.go:216] http://127.0.0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: <nil> &{Status:503 Service Unavailable StatusCode:503 Proto:HTTP/1.1 ProtoMajor:1 ProtoMinor:1 Header:map[Audit-Id:[a12f50b9-a0e3-494a-80a5-f605d4c48342] Cache-Control:[no-cache, private] Content-Length:[182] Content-Type:[application/json] Date:[Mon, 24 Nov 2025 09:02:12 GMT]] Body:0x40004c4a40 ContentLength:182 TransferEncoding:[] Close:false Uncompressed:false Trailer:map[] Request:0x40004d9400 TLS:<nil>}
I1124 09:02:12.646695 1692489 retry.go:31] will retry after 7.218649ms: Temporary Error: unexpected response code: 503
I1124 09:02:12.657668 1692489 dashboard.go:216] http://127.0.0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: <nil> &{Status:503 Service Unavailable StatusCode:503 Proto:HTTP/1.1 ProtoMajor:1 ProtoMinor:1 Header:map[Audit-Id:[75962ca5-5b6e-4e59-b236-059f82d6068a] Cache-Control:[no-cache, private] Content-Length:[182] Content-Type:[application/json] Date:[Mon, 24 Nov 2025 09:02:12 GMT]] Body:0x40006f3c80 ContentLength:182 TransferEncoding:[] Close:false Uncompressed:false Trailer:map[] Request:0x40004d9540 TLS:<nil>}
I1124 09:02:12.657734 1692489 retry.go:31] will retry after 6.881187ms: Temporary Error: unexpected response code: 503
I1124 09:02:12.667894 1692489 dashboard.go:216] http://127.0.0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: <nil> &{Status:503 Service Unavailable StatusCode:503 Proto:HTTP/1.1 ProtoMajor:1 ProtoMinor:1 Header:map[Audit-Id:[91954b25-5ff0-476d-a4b2-858c0f0bba7e] Cache-Control:[no-cache, private] Content-Length:[182] Content-Type:[application/json] Date:[Mon, 24 Nov 2025 09:02:12 GMT]] Body:0x400041a2c0 ContentLength:182 TransferEncoding:[] Close:false Uncompressed:false Trailer:map[] Request:0x400026a500 TLS:<nil>}
I1124 09:02:12.667963 1692489 retry.go:31] will retry after 19.286069ms: Temporary Error: unexpected response code: 503
I1124 09:02:12.691427 1692489 dashboard.go:216] http://127.0.0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: <nil> &{Status:503 Service Unavailable StatusCode:503 Proto:HTTP/1.1 ProtoMajor:1 ProtoMinor:1 Header:map[Audit-Id:[2a6480d4-be09-4bea-9ec3-f0b080e192f1] Cache-Control:[no-cache, private] Content-Length:[182] Content-Type:[application/json] Date:[Mon, 24 Nov 2025 09:02:12 GMT]] Body:0x400041ab00 ContentLength:182 TransferEncoding:[] Close:false Uncompressed:false Trailer:map[] Request:0x40004d9680 TLS:<nil>}
I1124 09:02:12.691496 1692489 retry.go:31] will retry after 22.892109ms: Temporary Error: unexpected response code: 503
I1124 09:02:12.718025 1692489 dashboard.go:216] http://127.0.0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: <nil> &{Status:503 Service Unavailable StatusCode:503 Proto:HTTP/1.1 ProtoMajor:1 ProtoMinor:1 Header:map[Audit-Id:[b30e169d-9393-4302-a2ed-47df668d2ba9] Cache-Control:[no-cache, private] Content-Length:[182] Content-Type:[application/json] Date:[Mon, 24 Nov 2025 09:02:12 GMT]] Body:0x40004c4bc0 ContentLength:182 TransferEncoding:[] Close:false Uncompressed:false Trailer:map[] Request:0x40004d97c0 TLS:<nil>}
I1124 09:02:12.718132 1692489 retry.go:31] will retry after 25.703926ms: Temporary Error: unexpected response code: 503
I1124 09:02:12.747798 1692489 dashboard.go:216] http://127.0.0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: <nil> &{Status:503 Service Unavailable StatusCode:503 Proto:HTTP/1.1 ProtoMajor:1 ProtoMinor:1 Header:map[Audit-Id:[dbb637d6-1eb4-4242-bc7f-aa386e803757] Cache-Control:[no-cache, private] Content-Length:[182] Content-Type:[application/json] Date:[Mon, 24 Nov 2025 09:02:12 GMT]] Body:0x400041ae00 ContentLength:182 TransferEncoding:[] Close:false Uncompressed:false Trailer:map[] Request:0x400026a640 TLS:<nil>}
I1124 09:02:12.747881 1692489 retry.go:31] will retry after 43.368204ms: Temporary Error: unexpected response code: 503
I1124 09:02:12.795796 1692489 dashboard.go:216] http://127.0.0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: <nil> &{Status:503 Service Unavailable StatusCode:503 Proto:HTTP/1.1 ProtoMajor:1 ProtoMinor:1 Header:map[Audit-Id:[0626d3b2-5fcc-47e6-bcb1-7880ae45ef00] Cache-Control:[no-cache, private] Content-Length:[182] Content-Type:[application/json] Date:[Mon, 24 Nov 2025 09:02:12 GMT]] Body:0x400041b640 ContentLength:182 TransferEncoding:[] Close:false Uncompressed:false Trailer:map[] Request:0x400026a780 TLS:<nil>}
I1124 09:02:12.795862 1692489 retry.go:31] will retry after 61.221394ms: Temporary Error: unexpected response code: 503
I1124 09:02:12.861769 1692489 dashboard.go:216] http://127.0.0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: <nil> &{Status:503 Service Unavailable StatusCode:503 Proto:HTTP/1.1 ProtoMajor:1 ProtoMinor:1 Header:map[Audit-Id:[f880d122-e396-4b0f-a47f-0d41e2d8dbb7] Cache-Control:[no-cache, private] Content-Length:[182] Content-Type:[application/json] Date:[Mon, 24 Nov 2025 09:02:12 GMT]] Body:0x400032e080 ContentLength:182 TransferEncoding:[] Close:false Uncompressed:false Trailer:map[] Request:0x400026a8c0 TLS:<nil>}
I1124 09:02:12.861865 1692489 retry.go:31] will retry after 128.409858ms: Temporary Error: unexpected response code: 503
I1124 09:02:12.994431 1692489 dashboard.go:216] http://127.0.0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: <nil> &{Status:503 Service Unavailable StatusCode:503 Proto:HTTP/1.1 ProtoMajor:1 ProtoMinor:1 Header:map[Audit-Id:[e7297136-df82-4cc1-9e2e-6a498e515ce2] Cache-Control:[no-cache, private] Content-Length:[182] Content-Type:[application/json] Date:[Mon, 24 Nov 2025 09:02:12 GMT]] Body:0x400032e680 ContentLength:182 TransferEncoding:[] Close:false Uncompressed:false Trailer:map[] Request:0x40004d9900 TLS:<nil>}
I1124 09:02:12.994564 1692489 retry.go:31] will retry after 106.279338ms: Temporary Error: unexpected response code: 503
I1124 09:02:13.104874 1692489 dashboard.go:216] http://127.0.0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: <nil> &{Status:503 Service Unavailable StatusCode:503 Proto:HTTP/1.1 ProtoMajor:1 ProtoMinor:1 Header:map[Audit-Id:[9af90a81-d924-473a-aeff-447e750246a5] Cache-Control:[no-cache, private] Content-Length:[182] Content-Type:[application/json] Date:[Mon, 24 Nov 2025 09:02:13 GMT]] Body:0x40004c5140 ContentLength:182 TransferEncoding:[] Close:false Uncompressed:false Trailer:map[] Request:0x400026aa00 TLS:<nil>}
I1124 09:02:13.104937 1692489 retry.go:31] will retry after 149.208832ms: Temporary Error: unexpected response code: 503
I1124 09:02:13.259467 1692489 dashboard.go:216] http://127.0.0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: <nil> &{Status:503 Service Unavailable StatusCode:503 Proto:HTTP/1.1 ProtoMajor:1 ProtoMinor:1 Header:map[Audit-Id:[2b0a3a53-f1c4-4e68-a6be-c050bf62046f] Cache-Control:[no-cache, private] Content-Length:[182] Content-Type:[application/json] Date:[Mon, 24 Nov 2025 09:02:13 GMT]] Body:0x40004c51c0 ContentLength:182 TransferEncoding:[] Close:false Uncompressed:false Trailer:map[] Request:0x40004d9a40 TLS:<nil>}
I1124 09:02:13.259533 1692489 retry.go:31] will retry after 470.175046ms: Temporary Error: unexpected response code: 503
I1124 09:02:13.732888 1692489 dashboard.go:216] http://127.0.0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: <nil> &{Status:503 Service Unavailable StatusCode:503 Proto:HTTP/1.1 ProtoMajor:1 ProtoMinor:1 Header:map[Audit-Id:[b2200986-bc3d-46ba-b78e-8f32f6c32c15] Cache-Control:[no-cache, private] Content-Length:[188] Content-Type:[application/json] Date:[Mon, 24 Nov 2025 09:02:13 GMT]] Body:0x4001686000 ContentLength:188 TransferEncoding:[] Close:false Uncompressed:false Trailer:map[] Request:0x400026ab40 TLS:<nil>}
I1124 09:02:13.732970 1692489 retry.go:31] will retry after 271.05686ms: Temporary Error: unexpected response code: 503
I1124 09:02:14.008795 1692489 dashboard.go:216] http://127.0.0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: <nil> &{Status:503 Service Unavailable StatusCode:503 Proto:HTTP/1.1 ProtoMajor:1 ProtoMinor:1 Header:map[Audit-Id:[98108326-afe0-45a6-b32f-d4129e009a77] Cache-Control:[no-cache, private] Content-Length:[188] Content-Type:[application/json] Date:[Mon, 24 Nov 2025 09:02:14 GMT]] Body:0x40016860c0 ContentLength:188 TransferEncoding:[] Close:false Uncompressed:false Trailer:map[] Request:0x40004d9b80 TLS:<nil>}
I1124 09:02:14.008893 1692489 retry.go:31] will retry after 662.836049ms: Temporary Error: unexpected response code: 503
I1124 09:02:14.675565 1692489 dashboard.go:216] http://127.0.0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: <nil> &{Status:503 Service Unavailable StatusCode:503 Proto:HTTP/1.1 ProtoMajor:1 ProtoMinor:1 Header:map[Audit-Id:[52f21689-9869-4f9a-96c1-2d15f52ec5cc] Cache-Control:[no-cache, private] Content-Length:[188] Content-Type:[application/json] Date:[Mon, 24 Nov 2025 09:02:14 GMT]] Body:0x40004c5440 ContentLength:188 TransferEncoding:[] Close:false Uncompressed:false Trailer:map[] Request:0x400026ac80 TLS:<nil>}
I1124 09:02:14.675627 1692489 retry.go:31] will retry after 656.675679ms: Temporary Error: unexpected response code: 503
I1124 09:02:15.335343 1692489 dashboard.go:216] http://127.0.0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: <nil> &{Status:503 Service Unavailable StatusCode:503 Proto:HTTP/1.1 ProtoMajor:1 ProtoMinor:1 Header:map[Audit-Id:[2e0b4440-4513-407e-9d29-0d98a3b06ac0] Cache-Control:[no-cache, private] Content-Length:[188] Content-Type:[application/json] Date:[Mon, 24 Nov 2025 09:02:15 GMT]] Body:0x4001686140 ContentLength:188 TransferEncoding:[] Close:false Uncompressed:false Trailer:map[] Request:0x400026adc0 TLS:<nil>}
I1124 09:02:15.335410 1692489 retry.go:31] will retry after 2.302907808s: Temporary Error: unexpected response code: 503
I1124 09:02:17.642648 1692489 dashboard.go:216] http://127.0.0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: <nil> &{Status:503 Service Unavailable StatusCode:503 Proto:HTTP/1.1 ProtoMajor:1 ProtoMinor:1 Header:map[Audit-Id:[56791bad-f047-44eb-bdd5-0960ba19f086] Cache-Control:[no-cache, private] Content-Length:[188] Content-Type:[application/json] Date:[Mon, 24 Nov 2025 09:02:17 GMT]] Body:0x4001686200 ContentLength:188 TransferEncoding:[] Close:false Uncompressed:false Trailer:map[] Request:0x40004d9cc0 TLS:<nil>}
I1124 09:02:17.642712 1692489 retry.go:31] will retry after 3.690915433s: Temporary Error: unexpected response code: 503
I1124 09:02:21.336798 1692489 dashboard.go:216] http://127.0.0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: <nil> &{Status:503 Service Unavailable StatusCode:503 Proto:HTTP/1.1 ProtoMajor:1 ProtoMinor:1 Header:map[Audit-Id:[266e373e-d664-42be-b06f-42e21570a83b] Cache-Control:[no-cache, private] Content-Length:[188] Content-Type:[application/json] Date:[Mon, 24 Nov 2025 09:02:21 GMT]] Body:0x40004c5640 ContentLength:188 TransferEncoding:[] Close:false Uncompressed:false Trailer:map[] Request:0x400026af00 TLS:<nil>}
I1124 09:02:21.336859 1692489 retry.go:31] will retry after 2.263725327s: Temporary Error: unexpected response code: 503
I1124 09:02:23.604166 1692489 dashboard.go:216] http://127.0.0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: <nil> &{Status:503 Service Unavailable StatusCode:503 Proto:HTTP/1.1 ProtoMajor:1 ProtoMinor:1 Header:map[Audit-Id:[7c09f636-0173-4fa7-95d6-9d2300108fa4] Cache-Control:[no-cache, private] Content-Length:[188] Content-Type:[application/json] Date:[Mon, 24 Nov 2025 09:02:23 GMT]] Body:0x40016862c0 ContentLength:188 TransferEncoding:[] Close:false Uncompressed:false Trailer:map[] Request:0x400026b040 TLS:<nil>}
I1124 09:02:23.604237 1692489 retry.go:31] will retry after 8.372865565s: Temporary Error: unexpected response code: 503
I1124 09:02:31.981190 1692489 dashboard.go:216] http://127.0.0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: <nil> &{Status:503 Service Unavailable StatusCode:503 Proto:HTTP/1.1 ProtoMajor:1 ProtoMinor:1 Header:map[Audit-Id:[3de9dbde-c883-4049-8f5b-2c80d2720c55] Cache-Control:[no-cache, private] Content-Length:[188] Content-Type:[application/json] Date:[Mon, 24 Nov 2025 09:02:31 GMT]] Body:0x40016863c0 ContentLength:188 TransferEncoding:[] Close:false Uncompressed:false Trailer:map[] Request:0x40004d9e00 TLS:<nil>}
I1124 09:02:31.981252 1692489 retry.go:31] will retry after 4.782067129s: Temporary Error: unexpected response code: 503
I1124 09:02:36.766970 1692489 dashboard.go:216] http://127.0.0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: <nil> &{Status:503 Service Unavailable StatusCode:503 Proto:HTTP/1.1 ProtoMajor:1 ProtoMinor:1 Header:map[Audit-Id:[69760234-466f-40f1-a9aa-6ee5567b8914] Cache-Control:[no-cache, private] Content-Length:[188] Content-Type:[application/json] Date:[Mon, 24 Nov 2025 09:02:36 GMT]] Body:0x40004c5740 ContentLength:188 TransferEncoding:[] Close:false Uncompressed:false Trailer:map[] Request:0x40003c6000 TLS:<nil>}
I1124 09:02:36.767032 1692489 retry.go:31] will retry after 10.262243469s: Temporary Error: unexpected response code: 503
I1124 09:02:47.032718 1692489 dashboard.go:216] http://127.0.0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: <nil> &{Status:503 Service Unavailable StatusCode:503 Proto:HTTP/1.1 ProtoMajor:1 ProtoMinor:1 Header:map[Audit-Id:[ecd058d2-3e47-43fa-bda5-ff4ce1af2c04] Cache-Control:[no-cache, private] Content-Length:[188] Content-Type:[application/json] Date:[Mon, 24 Nov 2025 09:02:47 GMT]] Body:0x40004c5800 ContentLength:188 TransferEncoding:[] Close:false Uncompressed:false Trailer:map[] Request:0x400026b2c0 TLS:<nil>}
I1124 09:02:47.032784 1692489 retry.go:31] will retry after 18.5227278s: Temporary Error: unexpected response code: 503
I1124 09:03:05.561551 1692489 dashboard.go:216] http://127.0.0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: <nil> &{Status:503 Service Unavailable StatusCode:503 Proto:HTTP/1.1 ProtoMajor:1 ProtoMinor:1 Header:map[Audit-Id:[2e3f773e-20b2-44dc-9659-f069a3dd0255] Cache-Control:[no-cache, private] Content-Length:[188] Content-Type:[application/json] Date:[Mon, 24 Nov 2025 09:03:05 GMT]] Body:0x40004c58c0 ContentLength:188 TransferEncoding:[] Close:false Uncompressed:false Trailer:map[] Request:0x400026b680 TLS:<nil>}
I1124 09:03:05.561620 1692489 retry.go:31] will retry after 25.246715387s: Temporary Error: unexpected response code: 503
I1124 09:03:30.813598 1692489 dashboard.go:216] http://127.0.0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: <nil> &{Status:503 Service Unavailable StatusCode:503 Proto:HTTP/1.1 ProtoMajor:1 ProtoMinor:1 Header:map[Audit-Id:[ea089389-32ed-4514-9c84-3a8aac828a70] Cache-Control:[no-cache, private] Content-Length:[188] Content-Type:[application/json] Date:[Mon, 24 Nov 2025 09:03:30 GMT]] Body:0x4001686580 ContentLength:188 TransferEncoding:[] Close:false Uncompressed:false Trailer:map[] Request:0x40003c6140 TLS:<nil>}
I1124 09:03:30.813661 1692489 retry.go:31] will retry after 26.577515982s: Temporary Error: unexpected response code: 503
I1124 09:03:57.397717 1692489 dashboard.go:216] http://127.0.0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: <nil> &{Status:503 Service Unavailable StatusCode:503 Proto:HTTP/1.1 ProtoMajor:1 ProtoMinor:1 Header:map[Audit-Id:[4b6de510-80f9-43f5-9ffc-1f258794ec5b] Cache-Control:[no-cache, private] Content-Length:[188] Content-Type:[application/json] Date:[Mon, 24 Nov 2025 09:03:57 GMT]] Body:0x4001686600 ContentLength:188 TransferEncoding:[] Close:false Uncompressed:false Trailer:map[] Request:0x400026b7c0 TLS:<nil>}
I1124 09:03:57.397782 1692489 retry.go:31] will retry after 32.015713212s: Temporary Error: unexpected response code: 503
I1124 09:04:29.417575 1692489 dashboard.go:216] http://127.0.0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: <nil> &{Status:503 Service Unavailable StatusCode:503 Proto:HTTP/1.1 ProtoMajor:1 ProtoMinor:1 Header:map[Audit-Id:[b7d90ef8-2ea1-4bed-98e5-e9457de0ab9a] Cache-Control:[no-cache, private] Content-Length:[188] Content-Type:[application/json] Date:[Mon, 24 Nov 2025 09:04:29 GMT]] Body:0x40004c4ac0 ContentLength:188 TransferEncoding:[] Close:false Uncompressed:false Trailer:map[] Request:0x400026a000 TLS:<nil>}
I1124 09:04:29.417644 1692489 retry.go:31] will retry after 57.241621611s: Temporary Error: unexpected response code: 503
I1124 09:05:26.662912 1692489 dashboard.go:216] http://127.0.0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: <nil> &{Status:503 Service Unavailable StatusCode:503 Proto:HTTP/1.1 ProtoMajor:1 ProtoMinor:1 Header:map[Audit-Id:[697f5188-edbc-4355-ada4-9624217ac50a] Cache-Control:[no-cache, private] Content-Length:[188] Content-Type:[application/json] Date:[Mon, 24 Nov 2025 09:05:26 GMT]] Body:0x40004c4c00 ContentLength:188 TransferEncoding:[] Close:false Uncompressed:false Trailer:map[] Request:0x400026b900 TLS:<nil>}
I1124 09:05:26.662980 1692489 retry.go:31] will retry after 1m6.873787394s: Temporary Error: unexpected response code: 503
I1124 09:06:33.540019 1692489 dashboard.go:216] http://127.0.0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: <nil> &{Status:503 Service Unavailable StatusCode:503 Proto:HTTP/1.1 ProtoMajor:1 ProtoMinor:1 Header:map[Audit-Id:[1e42418e-5ddd-4298-9658-64a3b4e8499a] Cache-Control:[no-cache, private] Content-Length:[188] Content-Type:[application/json] Date:[Mon, 24 Nov 2025 09:06:33 GMT]] Body:0x40016860c0 ContentLength:188 TransferEncoding:[] Close:false Uncompressed:false Trailer:map[] Request:0x400026ba40 TLS:<nil>}
I1124 09:06:33.540090 1692489 retry.go:31] will retry after 1m14.956562734s: Temporary Error: unexpected response code: 503
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:223: ======>  post-mortem[TestFunctional/parallel/DashboardCmd]: network settings <======
helpers_test.go:230: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:238: ======>  post-mortem[TestFunctional/parallel/DashboardCmd]: docker inspect <======
helpers_test.go:239: (dbg) Run:  docker inspect functional-941011
helpers_test.go:243: (dbg) docker inspect functional-941011:
-- stdout --
	[
	    {
	        "Id": "d8574c2bf48ce967fe6e255a178ebf105b1a3019c24cf41f2b79f5302c127773",
	        "Created": "2025-11-24T08:53:47.57593314Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 1679494,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-11-24T08:53:47.634288317Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:572c983e466f1f784136812eef5cc59ac623db764bc7704d3676c4643993fd08",
	        "ResolvConfPath": "/var/lib/docker/containers/d8574c2bf48ce967fe6e255a178ebf105b1a3019c24cf41f2b79f5302c127773/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/d8574c2bf48ce967fe6e255a178ebf105b1a3019c24cf41f2b79f5302c127773/hostname",
	        "HostsPath": "/var/lib/docker/containers/d8574c2bf48ce967fe6e255a178ebf105b1a3019c24cf41f2b79f5302c127773/hosts",
	        "LogPath": "/var/lib/docker/containers/d8574c2bf48ce967fe6e255a178ebf105b1a3019c24cf41f2b79f5302c127773/d8574c2bf48ce967fe6e255a178ebf105b1a3019c24cf41f2b79f5302c127773-json.log",
	        "Name": "/functional-941011",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "/lib/modules:/lib/modules:ro",
	                "functional-941011:/var"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-941011",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "d8574c2bf48ce967fe6e255a178ebf105b1a3019c24cf41f2b79f5302c127773",
	                "LowerDir": "/var/lib/docker/overlay2/6eeb35a95c7cafb92e460f903c5570446f5d8f4a8b4814f2aec79043c07ef0d3-init/diff:/var/lib/docker/overlay2/bf294786c7ade59cb761c7bdcc27045362c3779b00a569b685443f34ade17737/diff",
	                "MergedDir": "/var/lib/docker/overlay2/6eeb35a95c7cafb92e460f903c5570446f5d8f4a8b4814f2aec79043c07ef0d3/merged",
	                "UpperDir": "/var/lib/docker/overlay2/6eeb35a95c7cafb92e460f903c5570446f5d8f4a8b4814f2aec79043c07ef0d3/diff",
	                "WorkDir": "/var/lib/docker/overlay2/6eeb35a95c7cafb92e460f903c5570446f5d8f4a8b4814f2aec79043c07ef0d3/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            },
	            {
	                "Type": "volume",
	                "Name": "functional-941011",
	                "Source": "/var/lib/docker/volumes/functional-941011/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-941011",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-941011",
	                "name.minikube.sigs.k8s.io": "functional-941011",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "91cbecc0d651d94558cb202589b12e740389d40de185d06770e23f82cb68fc8d",
	            "SandboxKey": "/var/run/docker/netns/91cbecc0d651",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34679"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34680"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34683"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34681"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34682"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-941011": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "36:03:d7:7f:e5:c7",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "e7f7867899e274c20f652612139490d61ff49918c5fef46ebcab3194d02671b8",
	                    "EndpointID": "ca16f2cc76565150d8b128df549a1bd659112397b50cb5fa5c6631e2b78b03b5",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-941011",
	                        "d8574c2bf48c"
	                    ]
	                }
	            }
	        }
	    }
	]

                                                
                                                
-- /stdout --
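The inspect output above maps each exposed container port (22, 2376, 5000, 8441, 32443) to a loopback host port under `NetworkSettings.Ports`. A minimal sketch of reading those bindings from `docker inspect` JSON; `host_ports` is a hypothetical helper for illustration, not part of the test harness:

```python
import json

def host_ports(inspect_json: str) -> dict:
    """Map container port (e.g. '22/tcp') -> host port from `docker inspect` output."""
    data = json.loads(inspect_json)[0]  # docker inspect returns a one-element list
    ports = data["NetworkSettings"]["Ports"]
    # Unbound ports appear as null/None; keep only ports with at least one binding.
    return {port: bindings[0]["HostPort"] for port, bindings in ports.items() if bindings}
```

With the output above, this would yield e.g. `"22/tcp" -> "34679"`, which is the SSH port minikube dials on 127.0.0.1.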
helpers_test.go:247: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-941011 -n functional-941011
helpers_test.go:252: <<< TestFunctional/parallel/DashboardCmd FAILED: start of post-mortem logs <<<
helpers_test.go:253: ======>  post-mortem[TestFunctional/parallel/DashboardCmd]: minikube logs <======
helpers_test.go:255: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 logs -n 25
helpers_test.go:255: (dbg) Done: out/minikube-linux-arm64 -p functional-941011 logs -n 25: (1.524920981s)
helpers_test.go:260: TestFunctional/parallel/DashboardCmd logs: 
-- stdout --
	
	==> Audit <==
	┌───────────┬───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│  COMMAND  │                                                               ARGS                                                                │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├───────────┼───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ addons    │ functional-941011 addons list                                                                                                     │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:00 UTC │ 24 Nov 25 09:00 UTC │
	│ addons    │ functional-941011 addons list -o json                                                                                             │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:00 UTC │ 24 Nov 25 09:00 UTC │
	│ mount     │ -p functional-941011 /tmp/TestFunctionalparallelMountCmdany-port600351307/001:/mount-9p --alsologtostderr -v=1                    │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:01 UTC │                     │
	│ ssh       │ functional-941011 ssh findmnt -T /mount-9p | grep 9p                                                                              │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:01 UTC │ 24 Nov 25 09:02 UTC │
	│ ssh       │ functional-941011 ssh -- ls -la /mount-9p                                                                                         │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:02 UTC │ 24 Nov 25 09:02 UTC │
	│ ssh       │ functional-941011 ssh cat /mount-9p/test-1763974919752046340                                                                      │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:02 UTC │ 24 Nov 25 09:02 UTC │
	│ ssh       │ functional-941011 ssh stat /mount-9p/created-by-test                                                                              │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:02 UTC │ 24 Nov 25 09:02 UTC │
	│ ssh       │ functional-941011 ssh stat /mount-9p/created-by-pod                                                                               │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:02 UTC │ 24 Nov 25 09:02 UTC │
	│ ssh       │ functional-941011 ssh sudo umount -f /mount-9p                                                                                    │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:02 UTC │ 24 Nov 25 09:02 UTC │
	│ ssh       │ functional-941011 ssh findmnt -T /mount-9p | grep 9p                                                                              │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:02 UTC │                     │
	│ mount     │ -p functional-941011 /tmp/TestFunctionalparallelMountCmdspecific-port1627151296/001:/mount-9p --alsologtostderr -v=1 --port 46464 │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:02 UTC │                     │
	│ ssh       │ functional-941011 ssh findmnt -T /mount-9p | grep 9p                                                                              │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:02 UTC │ 24 Nov 25 09:02 UTC │
	│ ssh       │ functional-941011 ssh -- ls -la /mount-9p                                                                                         │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:02 UTC │ 24 Nov 25 09:02 UTC │
	│ ssh       │ functional-941011 ssh sudo umount -f /mount-9p                                                                                    │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:02 UTC │                     │
	│ mount     │ -p functional-941011 /tmp/TestFunctionalparallelMountCmdVerifyCleanup1964745822/001:/mount1 --alsologtostderr -v=1                │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:02 UTC │                     │
	│ ssh       │ functional-941011 ssh findmnt -T /mount1                                                                                          │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:02 UTC │ 24 Nov 25 09:02 UTC │
	│ mount     │ -p functional-941011 /tmp/TestFunctionalparallelMountCmdVerifyCleanup1964745822/001:/mount2 --alsologtostderr -v=1                │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:02 UTC │                     │
	│ mount     │ -p functional-941011 /tmp/TestFunctionalparallelMountCmdVerifyCleanup1964745822/001:/mount3 --alsologtostderr -v=1                │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:02 UTC │                     │
	│ ssh       │ functional-941011 ssh findmnt -T /mount2                                                                                          │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:02 UTC │ 24 Nov 25 09:02 UTC │
	│ ssh       │ functional-941011 ssh findmnt -T /mount3                                                                                          │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:02 UTC │ 24 Nov 25 09:02 UTC │
	│ mount     │ -p functional-941011 --kill=true                                                                                                  │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:02 UTC │                     │
	│ start     │ -p functional-941011 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=containerd                   │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:02 UTC │                     │
	│ start     │ -p functional-941011 --dry-run --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd                             │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:02 UTC │                     │
	│ start     │ -p functional-941011 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=containerd                   │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:02 UTC │                     │
	│ dashboard │ --url --port 36195 -p functional-941011 --alsologtostderr -v=1                                                                    │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:02 UTC │                     │
	└───────────┴───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/11/24 09:02:10
	Running on machine: ip-172-31-30-239
	Binary: Built with gc go1.25.3 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1124 09:02:10.965378 1692444 out.go:360] Setting OutFile to fd 1 ...
	I1124 09:02:10.965542 1692444 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1124 09:02:10.965568 1692444 out.go:374] Setting ErrFile to fd 2...
	I1124 09:02:10.965585 1692444 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1124 09:02:10.965987 1692444 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21978-1652607/.minikube/bin
	I1124 09:02:10.966438 1692444 out.go:368] Setting JSON to false
	I1124 09:02:10.967543 1692444 start.go:133] hostinfo: {"hostname":"ip-172-31-30-239","uptime":27860,"bootTime":1763947071,"procs":190,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"92f46a7d-c249-4c12-924a-77f64874c910"}
	I1124 09:02:10.967616 1692444 start.go:143] virtualization:  
	I1124 09:02:10.970861 1692444 out.go:179] * [functional-941011] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1124 09:02:10.974642 1692444 out.go:179]   - MINIKUBE_LOCATION=21978
	I1124 09:02:10.974681 1692444 notify.go:221] Checking for updates...
	I1124 09:02:10.980472 1692444 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1124 09:02:10.983548 1692444 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/21978-1652607/kubeconfig
	I1124 09:02:10.986537 1692444 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/21978-1652607/.minikube
	I1124 09:02:10.989522 1692444 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1124 09:02:10.992492 1692444 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1124 09:02:10.996012 1692444 config.go:182] Loaded profile config "functional-941011": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
	I1124 09:02:10.999624 1692444 driver.go:422] Setting default libvirt URI to qemu:///system
	I1124 09:02:11.026708 1692444 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1124 09:02:11.026833 1692444 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1124 09:02:11.090005 1692444 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-11-24 09:02:11.080326924 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214831104 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1124 09:02:11.090124 1692444 docker.go:319] overlay module found
	I1124 09:02:11.093275 1692444 out.go:179] * Using the docker driver based on the existing profile
	I1124 09:02:11.095997 1692444 start.go:309] selected driver: docker
	I1124 09:02:11.096021 1692444 start.go:927] validating driver "docker" against &{Name:functional-941011 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.2 ClusterName:functional-941011 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.34.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1124 09:02:11.096132 1692444 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1124 09:02:11.099708 1692444 out.go:203] 
	W1124 09:02:11.102517 1692444 out.go:285] X Exiting due to RSRC_INSUFFICIENT_REQ_MEMORY: The requested memory allocation of 250MiB is less than the usable minimum of 1800MB
	I1124 09:02:11.105331 1692444 out.go:203] 
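The dry-run above exits with `RSRC_INSUFFICIENT_REQ_MEMORY` because `--memory 250MB` falls below minikube's 1800MB usable minimum (an expected failure path in this test). A rough sketch of that validation; `validate_memory` is illustrative, not minikube's actual source:

```python
# Hypothetical sketch of a minimum-memory check like minikube's
# RSRC_INSUFFICIENT_REQ_MEMORY validation (assumed constant, not the real code).
MIN_USABLE_MB = 1800

def validate_memory(requested_mb: int) -> None:
    """Reject a memory request below the usable minimum, as the dry-run above does."""
    if requested_mb < MIN_USABLE_MB:
        raise SystemExit(
            f"RSRC_INSUFFICIENT_REQ_MEMORY: requested {requested_mb}MB "
            f"is less than the usable minimum of {MIN_USABLE_MB}MB"
        )
```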
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                      ATTEMPT             POD ID              POD                                         NAMESPACE
	423ebfbd58861       1611cd07b61d5       5 minutes ago       Exited              mount-munger              0                   acb67d9acb0da       busybox-mount                               default
	85c0699fd2b2d       ce2d2cda2d858       10 minutes ago      Running             echo-server               0                   8f38d234a4a08       hello-node-75c85bcc94-srggf                 default
	e068a2929b9a7       ba04bb24b9575       11 minutes ago      Running             storage-provisioner       2                   1d6d04a32577c       storage-provisioner                         kube-system
	12cb72b6be32a       b178af3d91f80       11 minutes ago      Running             kube-apiserver            0                   43e43c411fcfa       kube-apiserver-functional-941011            kube-system
	bb12959193539       1b34917560f09       11 minutes ago      Running             kube-controller-manager   1                   5f4eb46413316       kube-controller-manager-functional-941011   kube-system
	f631f089f7db7       2c5f0dedd21c2       11 minutes ago      Running             etcd                      1                   abc7abcd5ce10       etcd-functional-941011                      kube-system
	4db14bc0f3747       4f982e73e768a       11 minutes ago      Running             kube-scheduler            1                   0fac3868adf86       kube-scheduler-functional-941011            kube-system
	b6a84160fb5a4       ba04bb24b9575       11 minutes ago      Exited              storage-provisioner       1                   1d6d04a32577c       storage-provisioner                         kube-system
	fc6fc133ebd24       138784d87c9c5       11 minutes ago      Running             coredns                   1                   b1743c0f4c096       coredns-66bc5c9577-slkfz                    kube-system
	73cbc10088bae       94bff1bec29fd       11 minutes ago      Running             kube-proxy                1                   31e96b363c821       kube-proxy-kmdsq                            kube-system
	c6aa1ae3a9097       b1a8c6f707935       11 minutes ago      Running             kindnet-cni               1                   63a1adeff106f       kindnet-vsrrw                               kube-system
	c59aa8a0dc911       138784d87c9c5       12 minutes ago      Exited              coredns                   0                   b1743c0f4c096       coredns-66bc5c9577-slkfz                    kube-system
	ff28a2e89f65b       b1a8c6f707935       12 minutes ago      Exited              kindnet-cni               0                   63a1adeff106f       kindnet-vsrrw                               kube-system
	62155860688a4       94bff1bec29fd       12 minutes ago      Exited              kube-proxy                0                   31e96b363c821       kube-proxy-kmdsq                            kube-system
	39dbd124ad913       4f982e73e768a       13 minutes ago      Exited              kube-scheduler            0                   0fac3868adf86       kube-scheduler-functional-941011            kube-system
	8a2c1ed065d22       1b34917560f09       13 minutes ago      Exited              kube-controller-manager   0                   5f4eb46413316       kube-controller-manager-functional-941011   kube-system
	c67aa399f4b30       2c5f0dedd21c2       13 minutes ago      Exited              etcd                      0                   abc7abcd5ce10       etcd-functional-941011                      kube-system
	
	
	==> containerd <==
	Nov 24 09:03:41 functional-941011 containerd[3558]: time="2025-11-24T09:03:41.767127140Z" level=info msg="stop pulling image docker.io/kubernetesui/metrics-scraper@sha256:76049887f07a0476dc93efc2d3569b9529bf982b22d29f356092ce206e98765c: active requests=0, bytes read=11047"
	Nov 24 09:03:41 functional-941011 containerd[3558]: time="2025-11-24T09:03:41.767122857Z" level=error msg="PullImage \"docker.io/kubernetesui/metrics-scraper:v1.0.8@sha256:76049887f07a0476dc93efc2d3569b9529bf982b22d29f356092ce206e98765c\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"docker.io/kubernetesui/metrics-scraper@sha256:76049887f07a0476dc93efc2d3569b9529bf982b22d29f356092ce206e98765c\": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/kubernetesui/metrics-scraper/manifests/sha256:76049887f07a0476dc93efc2d3569b9529bf982b22d29f356092ce206e98765c: 429 Too Many Requests\ntoomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit"
	Nov 24 09:03:42 functional-941011 containerd[3558]: time="2025-11-24T09:03:42.373052208Z" level=info msg="PullImage \"docker.io/kubernetesui/dashboard:v2.7.0@sha256:2e500d29e9d5f4a086b908eb8dfe7ecac57d2ab09d65b24f588b1d449841ef93\""
	Nov 24 09:03:42 functional-941011 containerd[3558]: time="2025-11-24T09:03:42.783561323Z" level=error msg="PullImage \"docker.io/kubernetesui/dashboard:v2.7.0@sha256:2e500d29e9d5f4a086b908eb8dfe7ecac57d2ab09d65b24f588b1d449841ef93\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"docker.io/kubernetesui/dashboard@sha256:2e500d29e9d5f4a086b908eb8dfe7ecac57d2ab09d65b24f588b1d449841ef93\": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/kubernetesui/dashboard/manifests/sha256:2e500d29e9d5f4a086b908eb8dfe7ecac57d2ab09d65b24f588b1d449841ef93: 429 Too Many Requests\ntoomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit"
	Nov 24 09:03:42 functional-941011 containerd[3558]: time="2025-11-24T09:03:42.783594480Z" level=info msg="stop pulling image docker.io/kubernetesui/dashboard@sha256:2e500d29e9d5f4a086b908eb8dfe7ecac57d2ab09d65b24f588b1d449841ef93: active requests=0, bytes read=11015"
	Nov 24 09:03:47 functional-941011 containerd[3558]: time="2025-11-24T09:03:47.371110525Z" level=info msg="PullImage \"kicbase/echo-server:latest\""
	Nov 24 09:03:47 functional-941011 containerd[3558]: time="2025-11-24T09:03:47.786872389Z" level=error msg="PullImage \"kicbase/echo-server:latest\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"docker.io/kicbase/echo-server:latest\": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/kicbase/echo-server/manifests/sha256:127ac38a2bb9537b7f252addff209ea6801edcac8a92c8b1104dacd66a583ed6: 429 Too Many Requests\ntoomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit"
	Nov 24 09:03:47 functional-941011 containerd[3558]: time="2025-11-24T09:03:47.786893936Z" level=info msg="stop pulling image docker.io/kicbase/echo-server:latest: active requests=0, bytes read=10999"
	Nov 24 09:05:03 functional-941011 containerd[3558]: time="2025-11-24T09:05:03.371038534Z" level=info msg="PullImage \"docker.io/kubernetesui/dashboard:v2.7.0@sha256:2e500d29e9d5f4a086b908eb8dfe7ecac57d2ab09d65b24f588b1d449841ef93\""
	Nov 24 09:05:03 functional-941011 containerd[3558]: time="2025-11-24T09:05:03.911003679Z" level=error msg="PullImage \"docker.io/kubernetesui/dashboard:v2.7.0@sha256:2e500d29e9d5f4a086b908eb8dfe7ecac57d2ab09d65b24f588b1d449841ef93\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"docker.io/kubernetesui/dashboard@sha256:2e500d29e9d5f4a086b908eb8dfe7ecac57d2ab09d65b24f588b1d449841ef93\": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/kubernetesui/dashboard/manifests/sha256:5c52c60663b473628bd98e4ffee7a747ef1f88d8c7bcee957b089fb3f61bdedf: 429 Too Many Requests\ntoomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit"
	Nov 24 09:05:03 functional-941011 containerd[3558]: time="2025-11-24T09:05:03.911023216Z" level=info msg="stop pulling image docker.io/kubernetesui/dashboard@sha256:2e500d29e9d5f4a086b908eb8dfe7ecac57d2ab09d65b24f588b1d449841ef93: active requests=0, bytes read=12709"
	Nov 24 09:05:12 functional-941011 containerd[3558]: time="2025-11-24T09:05:12.372167699Z" level=info msg="PullImage \"docker.io/kubernetesui/metrics-scraper:v1.0.8@sha256:76049887f07a0476dc93efc2d3569b9529bf982b22d29f356092ce206e98765c\""
	Nov 24 09:05:12 functional-941011 containerd[3558]: time="2025-11-24T09:05:12.795428551Z" level=info msg="stop pulling image docker.io/kubernetesui/metrics-scraper@sha256:76049887f07a0476dc93efc2d3569b9529bf982b22d29f356092ce206e98765c: active requests=0, bytes read=11047"
	Nov 24 09:05:12 functional-941011 containerd[3558]: time="2025-11-24T09:05:12.795449277Z" level=error msg="PullImage \"docker.io/kubernetesui/metrics-scraper:v1.0.8@sha256:76049887f07a0476dc93efc2d3569b9529bf982b22d29f356092ce206e98765c\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"docker.io/kubernetesui/metrics-scraper@sha256:76049887f07a0476dc93efc2d3569b9529bf982b22d29f356092ce206e98765c\": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/kubernetesui/metrics-scraper/manifests/sha256:76049887f07a0476dc93efc2d3569b9529bf982b22d29f356092ce206e98765c: 429 Too Many Requests\ntoomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit"
	Nov 24 09:05:44 functional-941011 containerd[3558]: time="2025-11-24T09:05:44.963076451Z" level=info msg="container event discarded" container=64a864b7fd77190ba8724019e3e6a3bcf07aea9f2ccf65c85c8c1e994e477864 type=CONTAINER_CREATED_EVENT
	Nov 24 09:05:44 functional-941011 containerd[3558]: time="2025-11-24T09:05:44.963184613Z" level=info msg="container event discarded" container=64a864b7fd77190ba8724019e3e6a3bcf07aea9f2ccf65c85c8c1e994e477864 type=CONTAINER_STARTED_EVENT
	Nov 24 09:06:36 functional-941011 containerd[3558]: time="2025-11-24T09:06:36.371060427Z" level=info msg="PullImage \"kicbase/echo-server:latest\""
	Nov 24 09:06:36 functional-941011 containerd[3558]: time="2025-11-24T09:06:36.897295345Z" level=error msg="PullImage \"kicbase/echo-server:latest\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"docker.io/kicbase/echo-server:latest\": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/kicbase/echo-server/manifests/sha256:42a89d9b22e5307cb88494990d5d929c401339f508c0a7e98a4d8ac52623fc5b: 429 Too Many Requests\ntoomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit"
	Nov 24 09:06:36 functional-941011 containerd[3558]: time="2025-11-24T09:06:36.897316392Z" level=info msg="stop pulling image docker.io/kicbase/echo-server:latest: active requests=0, bytes read=11740"
	Nov 24 09:07:01 functional-941011 containerd[3558]: time="2025-11-24T09:07:01.518345818Z" level=info msg="container event discarded" container=acb67d9acb0daf9420aa1204ef283901f23261427b26862cf7549b35a5da4d93 type=CONTAINER_CREATED_EVENT
	Nov 24 09:07:01 functional-941011 containerd[3558]: time="2025-11-24T09:07:01.518447308Z" level=info msg="container event discarded" container=acb67d9acb0daf9420aa1204ef283901f23261427b26862cf7549b35a5da4d93 type=CONTAINER_STARTED_EVENT
	Nov 24 09:07:03 functional-941011 containerd[3558]: time="2025-11-24T09:07:03.613134023Z" level=info msg="container event discarded" container=423ebfbd588615eb4bc213b5f471b5301692b734e90d6664d802a7debc9f3474 type=CONTAINER_CREATED_EVENT
	Nov 24 09:07:03 functional-941011 containerd[3558]: time="2025-11-24T09:07:03.694549929Z" level=info msg="container event discarded" container=423ebfbd588615eb4bc213b5f471b5301692b734e90d6664d802a7debc9f3474 type=CONTAINER_STARTED_EVENT
	Nov 24 09:07:03 functional-941011 containerd[3558]: time="2025-11-24T09:07:03.748953231Z" level=info msg="container event discarded" container=423ebfbd588615eb4bc213b5f471b5301692b734e90d6664d802a7debc9f3474 type=CONTAINER_STOPPED_EVENT
	Nov 24 09:07:05 functional-941011 containerd[3558]: time="2025-11-24T09:07:05.533170489Z" level=info msg="container event discarded" container=acb67d9acb0daf9420aa1204ef283901f23261427b26862cf7549b35a5da4d93 type=CONTAINER_STOPPED_EVENT
	
	
	==> coredns [c59aa8a0dc911d866d1284907405697c4c3891d4b4c4423e9f764fbf37efa9c9] <==
	maxprocs: Leaving GOMAXPROCS=2: CPU quota undefined
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 9e2996f8cb67ac53e0259ab1f8d615d07d1beb0bd07e6a1e39769c3bf486a905bb991cc47f8d2f14d0d3a90a87dfc625a0b4c524fed169d8158c40657c0694b1
	CoreDNS-1.12.1
	linux/arm64, go1.24.1, 707c7c1
	[INFO] 127.0.0.1:51507 - 53970 "HINFO IN 1655404113277552318.700530042625674888. udp 56 false 512" NXDOMAIN qr,rd,ra 56 0.01547365s
	[INFO] SIGTERM: Shutting down servers then terminating
	[INFO] plugin/health: Going into lameduck mode for 5s
	
	
	==> coredns [fc6fc133ebd24da2f7fbde321f7cbdf8f57ce1d20c8e70b56a4530409fe91cf6] <==
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.32.3/tools/cache/reflector.go:251: failed to list *v1.Service: Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: connect: connection refused
	[ERROR] plugin/kubernetes: Unhandled Error
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.32.3/tools/cache/reflector.go:251: failed to list *v1.Namespace: Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: connect: connection refused
	[ERROR] plugin/kubernetes: Unhandled Error
	[WARNING] plugin/kubernetes: starting server with unsynced Kubernetes API
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 9e2996f8cb67ac53e0259ab1f8d615d07d1beb0bd07e6a1e39769c3bf486a905bb991cc47f8d2f14d0d3a90a87dfc625a0b4c524fed169d8158c40657c0694b1
	CoreDNS-1.12.1
	linux/arm64, go1.24.1, 707c7c1
	[INFO] 127.0.0.1:44127 - 22213 "HINFO IN 7327540771892768101.5397131767721917287. udp 57 false 512" NXDOMAIN qr,rd,ra 57 0.043158014s
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.32.3/tools/cache/reflector.go:251: failed to list *v1.Service: Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: connect: connection refused
	[ERROR] plugin/kubernetes: Unhandled Error
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.32.3/tools/cache/reflector.go:251: failed to list *v1.Namespace: Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: connect: connection refused
	[ERROR] plugin/kubernetes: Unhandled Error
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.32.3/tools/cache/reflector.go:251: failed to list *v1.EndpointSlice: Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: connect: connection refused
	[ERROR] plugin/kubernetes: Unhandled Error
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.32.3/tools/cache/reflector.go:251: failed to list *v1.Service: services is forbidden: User "system:serviceaccount:kube-system:coredns" cannot list resource "services" in API group "" at the cluster scope
	[ERROR] plugin/kubernetes: Unhandled Error
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.32.3/tools/cache/reflector.go:251: failed to list *v1.Namespace: namespaces is forbidden: User "system:serviceaccount:kube-system:coredns" cannot list resource "namespaces" in API group "" at the cluster scope
	[ERROR] plugin/kubernetes: Unhandled Error
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	
	
	==> describe nodes <==
	Name:               functional-941011
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=arm64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=arm64
	                    kubernetes.io/hostname=functional-941011
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=393ee3e0b845623107dce6cda4f48ffd5c3d1811
	                    minikube.k8s.io/name=functional-941011
	                    minikube.k8s.io/primary=true
	                    minikube.k8s.io/updated_at=2025_11_24T08_54_12_0700
	                    minikube.k8s.io/version=v1.37.0
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Mon, 24 Nov 2025 08:54:08 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  functional-941011
	  AcquireTime:     <unset>
	  RenewTime:       Mon, 24 Nov 2025 09:07:09 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Mon, 24 Nov 2025 09:02:14 +0000   Mon, 24 Nov 2025 08:54:06 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Mon, 24 Nov 2025 09:02:14 +0000   Mon, 24 Nov 2025 08:54:06 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Mon, 24 Nov 2025 09:02:14 +0000   Mon, 24 Nov 2025 08:54:06 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Mon, 24 Nov 2025 09:02:14 +0000   Mon, 24 Nov 2025 08:54:58 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.168.49.2
	  Hostname:    functional-941011
	Capacity:
	  cpu:                2
	  ephemeral-storage:  203034800Ki
	  hugepages-1Gi:      0
	  hugepages-2Mi:      0
	  hugepages-32Mi:     0
	  hugepages-64Ki:     0
	  memory:             8022296Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  203034800Ki
	  hugepages-1Gi:      0
	  hugepages-2Mi:      0
	  hugepages-32Mi:     0
	  hugepages-64Ki:     0
	  memory:             8022296Ki
	  pods:               110
	System Info:
	  Machine ID:                 7283ea1857f18f20a875c29069214c9d
	  System UUID:                d38b29f6-9a27-498f-a371-693f2677a0b6
	  Boot ID:                    e6ca431c-3a35-478f-87f6-f49cc4bc8a65
	  Kernel Version:             5.15.0-1084-aws
	  OS Image:                   Debian GNU/Linux 12 (bookworm)
	  Operating System:           linux
	  Architecture:               arm64
	  Container Runtime Version:  containerd://2.1.5
	  Kubelet Version:            v1.34.2
	  Kube-Proxy Version:         
	PodCIDR:                      10.244.0.0/24
	PodCIDRs:                     10.244.0.0/24
	Non-terminated Pods:          (14 in total)
	  Namespace                   Name                                          CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                          ------------  ----------  ---------------  -------------  ---
	  default                     hello-node-75c85bcc94-srggf                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         10m
	  default                     hello-node-connect-7d85dfc575-vjgqt           0 (0%)        0 (0%)      0 (0%)           0 (0%)         6m28s
	  default                     nginx-svc                                     0 (0%)        0 (0%)      0 (0%)           0 (0%)         10m
	  default                     sp-pod                                        0 (0%)        0 (0%)      0 (0%)           0 (0%)         10m
	  kube-system                 coredns-66bc5c9577-slkfz                      100m (5%)     0 (0%)      70Mi (0%)        170Mi (2%)     12m
	  kube-system                 etcd-functional-941011                        100m (5%)     0 (0%)      100Mi (1%)       0 (0%)         13m
	  kube-system                 kindnet-vsrrw                                 100m (5%)     100m (5%)   50Mi (0%)        50Mi (0%)      12m
	  kube-system                 kube-apiserver-functional-941011              250m (12%)    0 (0%)      0 (0%)           0 (0%)         11m
	  kube-system                 kube-controller-manager-functional-941011     200m (10%)    0 (0%)      0 (0%)           0 (0%)         13m
	  kube-system                 kube-proxy-kmdsq                              0 (0%)        0 (0%)      0 (0%)           0 (0%)         12m
	  kube-system                 kube-scheduler-functional-941011              100m (5%)     0 (0%)      0 (0%)           0 (0%)         13m
	  kube-system                 storage-provisioner                           0 (0%)        0 (0%)      0 (0%)           0 (0%)         12m
	  kubernetes-dashboard        dashboard-metrics-scraper-77bf4d6c4c-5qb25    0 (0%)        0 (0%)      0 (0%)           0 (0%)         5m
	  kubernetes-dashboard        kubernetes-dashboard-855c9754f9-tww5k         0 (0%)        0 (0%)      0 (0%)           0 (0%)         5m
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests    Limits
	  --------           --------    ------
	  cpu                850m (42%)  100m (5%)
	  memory             220Mi (2%)  220Mi (2%)
	  ephemeral-storage  0 (0%)      0 (0%)
	  hugepages-1Gi      0 (0%)      0 (0%)
	  hugepages-2Mi      0 (0%)      0 (0%)
	  hugepages-32Mi     0 (0%)      0 (0%)
	  hugepages-64Ki     0 (0%)      0 (0%)
	Events:
	  Type     Reason                   Age                From             Message
	  ----     ------                   ----               ----             -------
	  Normal   Starting                 12m                kube-proxy       
	  Normal   Starting                 11m                kube-proxy       
	  Normal   NodeAllocatableEnforced  13m                kubelet          Updated Node Allocatable limit across pods
	  Warning  CgroupV1                 13m                kubelet          cgroup v1 support is in maintenance mode, please migrate to cgroup v2
	  Normal   NodeHasSufficientMemory  13m                kubelet          Node functional-941011 status is now: NodeHasSufficientMemory
	  Normal   NodeHasNoDiskPressure    13m                kubelet          Node functional-941011 status is now: NodeHasNoDiskPressure
	  Normal   NodeHasSufficientPID     13m                kubelet          Node functional-941011 status is now: NodeHasSufficientPID
	  Normal   Starting                 13m                kubelet          Starting kubelet.
	  Normal   RegisteredNode           12m                node-controller  Node functional-941011 event: Registered Node functional-941011 in Controller
	  Normal   NodeReady                12m                kubelet          Node functional-941011 status is now: NodeReady
	  Normal   Starting                 11m                kubelet          Starting kubelet.
	  Warning  CgroupV1                 11m                kubelet          cgroup v1 support is in maintenance mode, please migrate to cgroup v2
	  Normal   NodeHasSufficientMemory  11m (x8 over 11m)  kubelet          Node functional-941011 status is now: NodeHasSufficientMemory
	  Normal   NodeHasNoDiskPressure    11m (x8 over 11m)  kubelet          Node functional-941011 status is now: NodeHasNoDiskPressure
	  Normal   NodeHasSufficientPID     11m (x7 over 11m)  kubelet          Node functional-941011 status is now: NodeHasSufficientPID
	  Normal   NodeAllocatableEnforced  11m                kubelet          Updated Node Allocatable limit across pods
	  Normal   RegisteredNode           11m                node-controller  Node functional-941011 event: Registered Node functional-941011 in Controller
	
	
	==> dmesg <==
	[Nov24 08:20] overlayfs: idmapped layers are currently not supported
	[Nov24 08:21] overlayfs: idmapped layers are currently not supported
	[Nov24 08:22] overlayfs: idmapped layers are currently not supported
	[Nov24 08:23] overlayfs: idmapped layers are currently not supported
	[Nov24 08:24] overlayfs: idmapped layers are currently not supported
	[ +19.566509] overlayfs: idmapped layers are currently not supported
	[Nov24 08:25] overlayfs: idmapped layers are currently not supported
	[ +35.934095] overlayfs: idmapped layers are currently not supported
	[Nov24 08:27] overlayfs: idmapped layers are currently not supported
	[Nov24 08:28] overlayfs: idmapped layers are currently not supported
	[Nov24 08:29] overlayfs: idmapped layers are currently not supported
	[Nov24 08:30] overlayfs: idmapped layers are currently not supported
	[Nov24 08:31] overlayfs: idmapped layers are currently not supported
	[ +44.505180] overlayfs: idmapped layers are currently not supported
	[Nov24 08:32] overlayfs: idmapped layers are currently not supported
	[Nov24 08:33] overlayfs: idmapped layers are currently not supported
	[Nov24 08:34] overlayfs: idmapped layers are currently not supported
	[ +21.137877] overlayfs: idmapped layers are currently not supported
	[ +28.724175] overlayfs: idmapped layers are currently not supported
	[Nov24 08:36] overlayfs: idmapped layers are currently not supported
	[Nov24 08:37] overlayfs: idmapped layers are currently not supported
	[Nov24 08:38] overlayfs: idmapped layers are currently not supported
	[  +4.063834] overlayfs: idmapped layers are currently not supported
	[Nov24 08:40] overlayfs: idmapped layers are currently not supported
	[Nov24 08:42] kauditd_printk_skb: 8 callbacks suppressed
	
	
	==> etcd [c67aa399f4b3099636ae5de0029e6f8778af29edd2cb80d538b3fce9f2752591] <==
	{"level":"warn","ts":"2025-11-24T08:54:08.005514Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:39950","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:54:08.022010Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:39954","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:54:08.048197Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:39980","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:54:08.061624Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:39986","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:54:08.083337Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:39998","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:54:08.099353Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:40006","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:54:08.164536Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:40022","server-name":"","error":"EOF"}
	{"level":"info","ts":"2025-11-24T08:55:38.679574Z","caller":"osutil/interrupt_unix.go:65","msg":"received signal; shutting down","signal":"terminated"}
	{"level":"info","ts":"2025-11-24T08:55:38.679640Z","caller":"embed/etcd.go:426","msg":"closing etcd server","name":"functional-941011","data-dir":"/var/lib/minikube/etcd","advertise-peer-urls":["https://192.168.49.2:2380"],"advertise-client-urls":["https://192.168.49.2:2379"]}
	{"level":"error","ts":"2025-11-24T08:55:38.679751Z","caller":"embed/etcd.go:912","msg":"setting up serving from embedded etcd failed.","error":"http: Server closed","stacktrace":"go.etcd.io/etcd/server/v3/embed.(*Etcd).errHandler\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:912\ngo.etcd.io/etcd/server/v3/embed.(*serveCtx).startHandler.func1\n\tgo.etcd.io/etcd/server/v3/embed/serve.go:90"}
	{"level":"error","ts":"2025-11-24T08:55:38.681678Z","caller":"embed/etcd.go:912","msg":"setting up serving from embedded etcd failed.","error":"http: Server closed","stacktrace":"go.etcd.io/etcd/server/v3/embed.(*Etcd).errHandler\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:912\ngo.etcd.io/etcd/server/v3/embed.(*serveCtx).startHandler.func1\n\tgo.etcd.io/etcd/server/v3/embed/serve.go:90"}
	{"level":"warn","ts":"2025-11-24T08:55:38.681817Z","caller":"embed/serve.go:245","msg":"stopping secure grpc server due to error","error":"accept tcp 127.0.0.1:2379: use of closed network connection"}
	{"level":"warn","ts":"2025-11-24T08:55:38.681861Z","caller":"embed/serve.go:247","msg":"stopped secure grpc server due to error","error":"accept tcp 127.0.0.1:2379: use of closed network connection"}
	{"level":"error","ts":"2025-11-24T08:55:38.681870Z","caller":"embed/etcd.go:912","msg":"setting up serving from embedded etcd failed.","error":"accept tcp 127.0.0.1:2379: use of closed network connection","stacktrace":"go.etcd.io/etcd/server/v3/embed.(*Etcd).errHandler\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:912\ngo.etcd.io/etcd/server/v3/embed.(*Etcd).startHandler.func1\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:906"}
	{"level":"warn","ts":"2025-11-24T08:55:38.681935Z","caller":"embed/serve.go:245","msg":"stopping secure grpc server due to error","error":"accept tcp 192.168.49.2:2379: use of closed network connection"}
	{"level":"warn","ts":"2025-11-24T08:55:38.681953Z","caller":"embed/serve.go:247","msg":"stopped secure grpc server due to error","error":"accept tcp 192.168.49.2:2379: use of closed network connection"}
	{"level":"error","ts":"2025-11-24T08:55:38.681992Z","caller":"embed/etcd.go:912","msg":"setting up serving from embedded etcd failed.","error":"accept tcp 127.0.0.1:2381: use of closed network connection","stacktrace":"go.etcd.io/etcd/server/v3/embed.(*Etcd).errHandler\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:912\ngo.etcd.io/etcd/server/v3/embed.(*Etcd).startHandler.func1\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:906"}
	{"level":"info","ts":"2025-11-24T08:55:38.682013Z","caller":"etcdserver/server.go:1297","msg":"skipped leadership transfer for single voting member cluster","local-member-id":"aec36adc501070cc","current-leader-member-id":"aec36adc501070cc"}
	{"level":"error","ts":"2025-11-24T08:55:38.681960Z","caller":"embed/etcd.go:912","msg":"setting up serving from embedded etcd failed.","error":"accept tcp 192.168.49.2:2379: use of closed network connection","stacktrace":"go.etcd.io/etcd/server/v3/embed.(*Etcd).errHandler\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:912\ngo.etcd.io/etcd/server/v3/embed.(*Etcd).startHandler.func1\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:906"}
	{"level":"info","ts":"2025-11-24T08:55:38.682068Z","caller":"etcdserver/server.go:2358","msg":"server has stopped; stopping storage version's monitor"}
	{"level":"info","ts":"2025-11-24T08:55:38.682078Z","caller":"etcdserver/server.go:2335","msg":"server has stopped; stopping cluster version's monitor"}
	{"level":"info","ts":"2025-11-24T08:55:38.685220Z","caller":"embed/etcd.go:621","msg":"stopping serving peer traffic","address":"192.168.49.2:2380"}
	{"level":"error","ts":"2025-11-24T08:55:38.685308Z","caller":"embed/etcd.go:912","msg":"setting up serving from embedded etcd failed.","error":"accept tcp 192.168.49.2:2380: use of closed network connection","stacktrace":"go.etcd.io/etcd/server/v3/embed.(*Etcd).errHandler\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:912\ngo.etcd.io/etcd/server/v3/embed.(*Etcd).startHandler.func1\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:906"}
	{"level":"info","ts":"2025-11-24T08:55:38.685331Z","caller":"embed/etcd.go:626","msg":"stopped serving peer traffic","address":"192.168.49.2:2380"}
	{"level":"info","ts":"2025-11-24T08:55:38.685338Z","caller":"embed/etcd.go:428","msg":"closed etcd server","name":"functional-941011","data-dir":"/var/lib/minikube/etcd","advertise-peer-urls":["https://192.168.49.2:2380"],"advertise-client-urls":["https://192.168.49.2:2379"]}
	
	
	==> etcd [f631f089f7db7469dd50cb6bbc0f39ec97f716fd16e1e26e288c4caf9028b204] <==
	{"level":"warn","ts":"2025-11-24T08:55:45.264118Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:40308","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:55:45.282780Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:40330","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:55:45.297953Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:40346","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:55:45.320828Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:40364","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:55:45.336722Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:40388","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:55:45.352974Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:40418","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:55:45.369743Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:40432","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:55:45.407571Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:40450","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:55:45.417153Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:40464","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:55:45.433463Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:40474","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:55:45.452186Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:40486","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:55:45.470636Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:40502","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:55:45.487399Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:40526","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:55:45.499700Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:40542","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:55:45.515500Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:40556","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:55:45.533300Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:40572","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:55:45.550559Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:40584","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:55:45.566082Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:40608","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:55:45.588289Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:40630","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:55:45.599854Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:40638","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:55:45.614758Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:40662","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:55:45.676226Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:40674","server-name":"","error":"EOF"}
	{"level":"info","ts":"2025-11-24T09:05:44.443465Z","caller":"mvcc/index.go:194","msg":"compact tree index","revision":983}
	{"level":"info","ts":"2025-11-24T09:05:44.451637Z","caller":"mvcc/kvstore_compaction.go:70","msg":"finished scheduled compaction","compact-revision":983,"took":"7.878151ms","hash":1980473058,"current-db-size-bytes":3526656,"current-db-size":"3.5 MB","current-db-size-in-use-bytes":3526656,"current-db-size-in-use":"3.5 MB"}
	{"level":"info","ts":"2025-11-24T09:05:44.451689Z","caller":"mvcc/hash.go:157","msg":"storing new hash","hash":1980473058,"revision":983,"compact-revision":-1}
	
	
	==> kernel <==
	 09:07:12 up  7:49,  0 user,  load average: 0.31, 0.39, 1.09
	Linux functional-941011 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kindnet [c6aa1ae3a9097decf5035a00445a3b904d34d773c5e3cbb839ca84c4b31b2ba7] <==
	I1124 09:05:09.323241       1 main.go:301] handling current node
	I1124 09:05:19.321884       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1124 09:05:19.321919       1 main.go:301] handling current node
	I1124 09:05:29.322553       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1124 09:05:29.322595       1 main.go:301] handling current node
	I1124 09:05:39.321807       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1124 09:05:39.321866       1 main.go:301] handling current node
	I1124 09:05:49.321864       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1124 09:05:49.321903       1 main.go:301] handling current node
	I1124 09:05:59.322372       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1124 09:05:59.322423       1 main.go:301] handling current node
	I1124 09:06:09.322314       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1124 09:06:09.322573       1 main.go:301] handling current node
	I1124 09:06:19.321897       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1124 09:06:19.321931       1 main.go:301] handling current node
	I1124 09:06:29.327925       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1124 09:06:29.328127       1 main.go:301] handling current node
	I1124 09:06:39.321444       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1124 09:06:39.321495       1 main.go:301] handling current node
	I1124 09:06:49.321839       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1124 09:06:49.322063       1 main.go:301] handling current node
	I1124 09:06:59.321733       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1124 09:06:59.321772       1 main.go:301] handling current node
	I1124 09:07:09.321541       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1124 09:07:09.321633       1 main.go:301] handling current node
	
	
	==> kindnet [ff28a2e89f65b3323fbc0f41262a0b445c865c0474fbdac4ec03f66b94a6826a] <==
	I1124 08:54:17.914898       1 main.go:139] hostIP = 192.168.49.2
	podIP = 192.168.49.2
	I1124 08:54:17.915083       1 main.go:148] setting mtu 1500 for CNI 
	I1124 08:54:17.915097       1 main.go:178] kindnetd IP family: "ipv4"
	I1124 08:54:17.915113       1 main.go:182] noMask IPv4 subnets: [10.244.0.0/16]
	time="2025-11-24T08:54:18Z" level=info msg="Created plugin 10-kube-network-policies (kindnetd, handles RunPodSandbox,RemovePodSandbox)"
	I1124 08:54:18.119698       1 controller.go:377] "Starting controller" name="kube-network-policies"
	I1124 08:54:18.119719       1 controller.go:381] "Waiting for informer caches to sync"
	I1124 08:54:18.119728       1 shared_informer.go:350] "Waiting for caches to sync" controller="kube-network-policies"
	I1124 08:54:18.119873       1 controller.go:390] nri plugin exited: failed to connect to NRI service: dial unix /var/run/nri/nri.sock: connect: no such file or directory
	E1124 08:54:48.120234       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.96.0.1:443/api/v1/nodes?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: i/o timeout" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Node"
	E1124 08:54:48.120236       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Namespace: Get \"https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: i/o timeout" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Namespace"
	E1124 08:54:48.120465       1 reflector.go:200] "Failed to watch" err="failed to list *v1.NetworkPolicy: Get \"https://10.96.0.1:443/apis/networking.k8s.io/v1/networkpolicies?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: i/o timeout" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.NetworkPolicy"
	E1124 08:54:48.120608       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Pod: Get \"https://10.96.0.1:443/api/v1/pods?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: i/o timeout" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Pod"
	I1124 08:54:49.719897       1 shared_informer.go:357] "Caches are synced" controller="kube-network-policies"
	I1124 08:54:49.719946       1 metrics.go:72] Registering metrics
	I1124 08:54:49.720169       1 controller.go:711] "Syncing nftables rules"
	I1124 08:54:58.119604       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1124 08:54:58.119873       1 main.go:301] handling current node
	I1124 08:55:08.122765       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1124 08:55:08.123311       1 main.go:301] handling current node
	I1124 08:55:18.122591       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1124 08:55:18.122626       1 main.go:301] handling current node
	I1124 08:55:28.122561       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1124 08:55:28.122829       1 main.go:301] handling current node
	
	
	==> kube-apiserver [12cb72b6be32a86f06dc22816a54fbdc70bf5efe54d3eba2e90672e835fad88f] <==
	I1124 08:55:46.624336       1 cache.go:39] Caches are synced for autoregister controller
	I1124 08:55:46.624559       1 shared_informer.go:356] "Caches are synced" controller="kubernetes-service-cidr-controller"
	I1124 08:55:46.624611       1 default_servicecidr_controller.go:137] Shutting down kubernetes-service-cidr-controller
	I1124 08:55:46.624734       1 shared_informer.go:356] "Caches are synced" controller="configmaps"
	I1124 08:55:46.637331       1 shared_informer.go:356] "Caches are synced" controller="ipallocator-repair-controller"
	I1124 08:55:46.644590       1 controller.go:667] quota admission added evaluator for: leases.coordination.k8s.io
	I1124 08:55:47.218727       1 storage_scheduling.go:111] all system priority classes are created successfully or already exist.
	I1124 08:55:47.442763       1 controller.go:667] quota admission added evaluator for: serviceaccounts
	W1124 08:55:47.629531       1 lease.go:265] Resetting endpoints for master service "kubernetes" to [192.168.49.2]
	I1124 08:55:47.631201       1 controller.go:667] quota admission added evaluator for: endpoints
	I1124 08:55:47.640973       1 controller.go:667] quota admission added evaluator for: endpointslices.discovery.k8s.io
	I1124 08:55:48.340758       1 controller.go:667] quota admission added evaluator for: daemonsets.apps
	I1124 08:55:48.559757       1 controller.go:667] quota admission added evaluator for: deployments.apps
	I1124 08:55:48.672814       1 controller.go:667] quota admission added evaluator for: roles.rbac.authorization.k8s.io
	I1124 08:55:48.686003       1 controller.go:667] quota admission added evaluator for: rolebindings.rbac.authorization.k8s.io
	I1124 08:55:57.253802       1 controller.go:667] quota admission added evaluator for: replicasets.apps
	I1124 08:56:15.389377       1 alloc.go:328] "allocated clusterIPs" service="default/invalid-svc" clusterIPs={"IPv4":"10.100.46.187"}
	E1124 08:56:19.614283       1 watch.go:272] "Unhandled Error" err="http2: stream closed" logger="UnhandledError"
	I1124 08:56:24.710981       1 alloc.go:328] "allocated clusterIPs" service="default/hello-node" clusterIPs={"IPv4":"10.110.108.17"}
	I1124 08:56:29.396574       1 alloc.go:328] "allocated clusterIPs" service="default/nginx-svc" clusterIPs={"IPv4":"10.99.196.76"}
	I1124 09:00:44.624602       1 alloc.go:328] "allocated clusterIPs" service="default/hello-node-connect" clusterIPs={"IPv4":"10.99.193.46"}
	I1124 09:02:12.158594       1 controller.go:667] quota admission added evaluator for: namespaces
	I1124 09:02:12.437241       1 alloc.go:328] "allocated clusterIPs" service="kubernetes-dashboard/kubernetes-dashboard" clusterIPs={"IPv4":"10.96.163.151"}
	I1124 09:02:12.462775       1 alloc.go:328] "allocated clusterIPs" service="kubernetes-dashboard/dashboard-metrics-scraper" clusterIPs={"IPv4":"10.102.193.246"}
	I1124 09:05:46.560487       1 cidrallocator.go:277] updated ClusterIP allocator for Service CIDR 10.96.0.0/12
	
	
	==> kube-controller-manager [8a2c1ed065d22c98f59a6c1c4a6816d5635a1993d4b79170db5bdf930e6a5464] <==
	I1124 08:54:15.869534       1 shared_informer.go:356] "Caches are synced" controller="PV protection"
	I1124 08:54:15.869543       1 shared_informer.go:356] "Caches are synced" controller="ReplicaSet"
	I1124 08:54:15.869777       1 shared_informer.go:356] "Caches are synced" controller="namespace"
	I1124 08:54:15.878398       1 range_allocator.go:428] "Set node PodCIDR" logger="node-ipam-controller" node="functional-941011" podCIDRs=["10.244.0.0/24"]
	I1124 08:54:15.882870       1 shared_informer.go:356] "Caches are synced" controller="HPA"
	I1124 08:54:15.886073       1 shared_informer.go:356] "Caches are synced" controller="garbage collector"
	I1124 08:54:15.890370       1 shared_informer.go:356] "Caches are synced" controller="PVC protection"
	I1124 08:54:15.903201       1 shared_informer.go:356] "Caches are synced" controller="garbage collector"
	I1124 08:54:15.903228       1 garbagecollector.go:154] "Garbage collector: all resource monitors have synced" logger="garbage-collector-controller"
	I1124 08:54:15.903236       1 garbagecollector.go:157] "Proceeding to collect garbage" logger="garbage-collector-controller"
	I1124 08:54:15.908644       1 shared_informer.go:356] "Caches are synced" controller="disruption"
	I1124 08:54:15.909119       1 shared_informer.go:356] "Caches are synced" controller="validatingadmissionpolicy-status"
	I1124 08:54:15.910226       1 shared_informer.go:356] "Caches are synced" controller="endpoint_slice"
	I1124 08:54:15.910325       1 shared_informer.go:356] "Caches are synced" controller="ClusterRoleAggregator"
	I1124 08:54:15.910394       1 shared_informer.go:356] "Caches are synced" controller="attach detach"
	I1124 08:54:15.910704       1 shared_informer.go:356] "Caches are synced" controller="job"
	I1124 08:54:15.910793       1 shared_informer.go:356] "Caches are synced" controller="GC"
	I1124 08:54:15.910845       1 shared_informer.go:356] "Caches are synced" controller="legacy-service-account-token-cleaner"
	I1124 08:54:15.910971       1 shared_informer.go:356] "Caches are synced" controller="VAC protection"
	I1124 08:54:15.911017       1 shared_informer.go:356] "Caches are synced" controller="taint-eviction-controller"
	I1124 08:54:15.913830       1 shared_informer.go:356] "Caches are synced" controller="endpoint_slice_mirroring"
	I1124 08:54:15.916079       1 shared_informer.go:356] "Caches are synced" controller="persistent volume"
	I1124 08:54:15.918449       1 shared_informer.go:356] "Caches are synced" controller="resource quota"
	I1124 08:54:15.919891       1 shared_informer.go:356] "Caches are synced" controller="ephemeral"
	I1124 08:55:00.864980       1 node_lifecycle_controller.go:1044] "Controller detected that some Nodes are Ready. Exiting master disruption mode" logger="node-lifecycle-controller"
	
	
	==> kube-controller-manager [bb129591935391164c1c4d497b52259d48968c471089ff98292655891beffe48] <==
	I1124 08:55:50.031540       1 shared_informer.go:356] "Caches are synced" controller="resource quota"
	I1124 08:55:50.034699       1 shared_informer.go:356] "Caches are synced" controller="certificate-csrsigning-kubelet-client"
	I1124 08:55:50.034721       1 shared_informer.go:356] "Caches are synced" controller="certificate-csrsigning-kubelet-serving"
	I1124 08:55:50.035887       1 shared_informer.go:356] "Caches are synced" controller="disruption"
	I1124 08:55:50.039119       1 shared_informer.go:356] "Caches are synced" controller="certificate-csrsigning-kube-apiserver-client"
	I1124 08:55:50.040438       1 shared_informer.go:356] "Caches are synced" controller="certificate-csrsigning-legacy-unknown"
	I1124 08:55:50.043772       1 shared_informer.go:356] "Caches are synced" controller="TTL"
	I1124 08:55:50.049118       1 shared_informer.go:356] "Caches are synced" controller="VAC protection"
	I1124 08:55:50.051401       1 shared_informer.go:356] "Caches are synced" controller="garbage collector"
	I1124 08:55:50.054626       1 shared_informer.go:356] "Caches are synced" controller="ephemeral"
	I1124 08:55:50.060982       1 shared_informer.go:356] "Caches are synced" controller="endpoint_slice_mirroring"
	I1124 08:55:50.068706       1 shared_informer.go:356] "Caches are synced" controller="TTL after finished"
	I1124 08:55:50.075379       1 shared_informer.go:356] "Caches are synced" controller="resource_claim"
	I1124 08:55:50.077489       1 shared_informer.go:356] "Caches are synced" controller="endpoint"
	I1124 08:55:50.093310       1 shared_informer.go:356] "Caches are synced" controller="garbage collector"
	I1124 08:55:50.093553       1 garbagecollector.go:154] "Garbage collector: all resource monitors have synced" logger="garbage-collector-controller"
	I1124 08:55:50.093642       1 garbagecollector.go:157] "Proceeding to collect garbage" logger="garbage-collector-controller"
	E1124 09:02:12.261542       1 replica_set.go:587] "Unhandled Error" err="sync \"kubernetes-dashboard/dashboard-metrics-scraper-77bf4d6c4c\" failed with pods \"dashboard-metrics-scraper-77bf4d6c4c-\" is forbidden: error looking up service account kubernetes-dashboard/kubernetes-dashboard: serviceaccount \"kubernetes-dashboard\" not found" logger="UnhandledError"
	E1124 09:02:12.275551       1 replica_set.go:587] "Unhandled Error" err="sync \"kubernetes-dashboard/dashboard-metrics-scraper-77bf4d6c4c\" failed with pods \"dashboard-metrics-scraper-77bf4d6c4c-\" is forbidden: error looking up service account kubernetes-dashboard/kubernetes-dashboard: serviceaccount \"kubernetes-dashboard\" not found" logger="UnhandledError"
	E1124 09:02:12.284944       1 replica_set.go:587] "Unhandled Error" err="sync \"kubernetes-dashboard/kubernetes-dashboard-855c9754f9\" failed with pods \"kubernetes-dashboard-855c9754f9-\" is forbidden: error looking up service account kubernetes-dashboard/kubernetes-dashboard: serviceaccount \"kubernetes-dashboard\" not found" logger="UnhandledError"
	E1124 09:02:12.286796       1 replica_set.go:587] "Unhandled Error" err="sync \"kubernetes-dashboard/dashboard-metrics-scraper-77bf4d6c4c\" failed with pods \"dashboard-metrics-scraper-77bf4d6c4c-\" is forbidden: error looking up service account kubernetes-dashboard/kubernetes-dashboard: serviceaccount \"kubernetes-dashboard\" not found" logger="UnhandledError"
	E1124 09:02:12.293749       1 replica_set.go:587] "Unhandled Error" err="sync \"kubernetes-dashboard/kubernetes-dashboard-855c9754f9\" failed with pods \"kubernetes-dashboard-855c9754f9-\" is forbidden: error looking up service account kubernetes-dashboard/kubernetes-dashboard: serviceaccount \"kubernetes-dashboard\" not found" logger="UnhandledError"
	E1124 09:02:12.297333       1 replica_set.go:587] "Unhandled Error" err="sync \"kubernetes-dashboard/dashboard-metrics-scraper-77bf4d6c4c\" failed with pods \"dashboard-metrics-scraper-77bf4d6c4c-\" is forbidden: error looking up service account kubernetes-dashboard/kubernetes-dashboard: serviceaccount \"kubernetes-dashboard\" not found" logger="UnhandledError"
	E1124 09:02:12.304126       1 replica_set.go:587] "Unhandled Error" err="sync \"kubernetes-dashboard/kubernetes-dashboard-855c9754f9\" failed with pods \"kubernetes-dashboard-855c9754f9-\" is forbidden: error looking up service account kubernetes-dashboard/kubernetes-dashboard: serviceaccount \"kubernetes-dashboard\" not found" logger="UnhandledError"
	E1124 09:02:12.311952       1 replica_set.go:587] "Unhandled Error" err="sync \"kubernetes-dashboard/kubernetes-dashboard-855c9754f9\" failed with pods \"kubernetes-dashboard-855c9754f9-\" is forbidden: error looking up service account kubernetes-dashboard/kubernetes-dashboard: serviceaccount \"kubernetes-dashboard\" not found" logger="UnhandledError"
	
	
	==> kube-proxy [62155860688a446bc4f0edf281779e28f4ecf113640dbf9995722ce6bfe4cd55] <==
	I1124 08:54:17.565304       1 server_linux.go:53] "Using iptables proxy"
	I1124 08:54:17.660540       1 shared_informer.go:349] "Waiting for caches to sync" controller="node informer cache"
	I1124 08:54:17.760879       1 shared_informer.go:356] "Caches are synced" controller="node informer cache"
	I1124 08:54:17.760920       1 server.go:219] "Successfully retrieved NodeIPs" NodeIPs=["192.168.49.2"]
	E1124 08:54:17.761072       1 server.go:256] "Kube-proxy configuration may be incomplete or incorrect" err="nodePortAddresses is unset; NodePort connections will be accepted on all local IPs. Consider using `--nodeport-addresses primary`"
	I1124 08:54:17.852239       1 server.go:265] "kube-proxy running in dual-stack mode" primary ipFamily="IPv4"
	I1124 08:54:17.852307       1 server_linux.go:132] "Using iptables Proxier"
	I1124 08:54:17.866599       1 proxier.go:242] "Setting route_localnet=1 to allow node-ports on localhost; to change this either disable iptables.localhostNodePorts (--iptables-localhost-nodeports) or set nodePortAddresses (--nodeport-addresses) to filter loopback addresses" ipFamily="IPv4"
	I1124 08:54:17.866936       1 server.go:527] "Version info" version="v1.34.2"
	I1124 08:54:17.866952       1 server.go:529] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I1124 08:54:17.868618       1 config.go:200] "Starting service config controller"
	I1124 08:54:17.868629       1 shared_informer.go:349] "Waiting for caches to sync" controller="service config"
	I1124 08:54:17.868646       1 config.go:106] "Starting endpoint slice config controller"
	I1124 08:54:17.868649       1 shared_informer.go:349] "Waiting for caches to sync" controller="endpoint slice config"
	I1124 08:54:17.868660       1 config.go:403] "Starting serviceCIDR config controller"
	I1124 08:54:17.868666       1 shared_informer.go:349] "Waiting for caches to sync" controller="serviceCIDR config"
	I1124 08:54:17.876964       1 config.go:309] "Starting node config controller"
	I1124 08:54:17.876996       1 shared_informer.go:349] "Waiting for caches to sync" controller="node config"
	I1124 08:54:17.877013       1 shared_informer.go:356] "Caches are synced" controller="node config"
	I1124 08:54:17.969733       1 shared_informer.go:356] "Caches are synced" controller="service config"
	I1124 08:54:17.969806       1 shared_informer.go:356] "Caches are synced" controller="endpoint slice config"
	I1124 08:54:17.972675       1 shared_informer.go:356] "Caches are synced" controller="serviceCIDR config"
	
	
	==> kube-proxy [73cbc10088baee047f5af65dd9d3da14101d5c79193d1084620f4ef78567007c] <==
	I1124 08:55:29.304214       1 shared_informer.go:349] "Waiting for caches to sync" controller="node informer cache"
	E1124 08:55:29.305266       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://control-plane.minikube.internal:8441/api/v1/nodes?fieldSelector=metadata.name%3Dfunctional-941011&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	E1124 08:55:30.871934       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://control-plane.minikube.internal:8441/api/v1/nodes?fieldSelector=metadata.name%3Dfunctional-941011&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	E1124 08:55:32.698486       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://control-plane.minikube.internal:8441/api/v1/nodes?fieldSelector=metadata.name%3Dfunctional-941011&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	E1124 08:55:38.932652       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://control-plane.minikube.internal:8441/api/v1/nodes?fieldSelector=metadata.name%3Dfunctional-941011&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	I1124 08:55:46.604339       1 shared_informer.go:356] "Caches are synced" controller="node informer cache"
	I1124 08:55:46.604382       1 server.go:219] "Successfully retrieved NodeIPs" NodeIPs=["192.168.49.2"]
	E1124 08:55:46.604726       1 server.go:256] "Kube-proxy configuration may be incomplete or incorrect" err="nodePortAddresses is unset; NodePort connections will be accepted on all local IPs. Consider using `--nodeport-addresses primary`"
	I1124 08:55:46.633087       1 server.go:265] "kube-proxy running in dual-stack mode" primary ipFamily="IPv4"
	I1124 08:55:46.633170       1 server_linux.go:132] "Using iptables Proxier"
	I1124 08:55:46.637676       1 proxier.go:242] "Setting route_localnet=1 to allow node-ports on localhost; to change this either disable iptables.localhostNodePorts (--iptables-localhost-nodeports) or set nodePortAddresses (--nodeport-addresses) to filter loopback addresses" ipFamily="IPv4"
	I1124 08:55:46.638100       1 server.go:527] "Version info" version="v1.34.2"
	I1124 08:55:46.638127       1 server.go:529] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I1124 08:55:46.640035       1 config.go:200] "Starting service config controller"
	I1124 08:55:46.640207       1 shared_informer.go:349] "Waiting for caches to sync" controller="service config"
	I1124 08:55:46.640350       1 config.go:106] "Starting endpoint slice config controller"
	I1124 08:55:46.640418       1 shared_informer.go:349] "Waiting for caches to sync" controller="endpoint slice config"
	I1124 08:55:46.640509       1 config.go:403] "Starting serviceCIDR config controller"
	I1124 08:55:46.640794       1 shared_informer.go:349] "Waiting for caches to sync" controller="serviceCIDR config"
	I1124 08:55:46.647186       1 config.go:309] "Starting node config controller"
	I1124 08:55:46.647270       1 shared_informer.go:349] "Waiting for caches to sync" controller="node config"
	I1124 08:55:46.647299       1 shared_informer.go:356] "Caches are synced" controller="node config"
	I1124 08:55:46.741106       1 shared_informer.go:356] "Caches are synced" controller="serviceCIDR config"
	I1124 08:55:46.741112       1 shared_informer.go:356] "Caches are synced" controller="service config"
	I1124 08:55:46.741150       1 shared_informer.go:356] "Caches are synced" controller="endpoint slice config"
	
	
	==> kube-scheduler [39dbd124ad9131afd2e38d1cc6019bc4e02106f43653bf39b293c3e257811124] <==
	E1124 08:54:08.898071       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ResourceClaim: resourceclaims.resource.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"resourceclaims\" in API group \"resource.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ResourceClaim"
	E1124 08:54:08.898417       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:kube-scheduler\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
	E1124 08:54:08.898585       1 reflector.go:205] "Failed to watch" err="failed to list *v1.PersistentVolume: persistentvolumes is forbidden: User \"system:kube-scheduler\" cannot list resource \"persistentvolumes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.PersistentVolume"
	E1124 08:54:08.898858       1 reflector.go:205] "Failed to watch" err="failed to list *v1.StatefulSet: statefulsets.apps is forbidden: User \"system:kube-scheduler\" cannot list resource \"statefulsets\" in API group \"apps\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.StatefulSet"
	E1124 08:54:08.898950       1 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
	E1124 08:54:08.899058       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ReplicationController: replicationcontrollers is forbidden: User \"system:kube-scheduler\" cannot list resource \"replicationcontrollers\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ReplicationController"
	E1124 08:54:08.899111       1 reflector.go:205] "Failed to watch" err="failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"storageclasses\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.StorageClass"
	E1124 08:54:09.739885       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Namespace: namespaces is forbidden: User \"system:kube-scheduler\" cannot list resource \"namespaces\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Namespace"
	E1124 08:54:09.749915       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User \"system:kube-scheduler\" cannot list resource \"replicasets\" in API group \"apps\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ReplicaSet"
	E1124 08:54:09.907520       1 reflector.go:205] "Failed to watch" err="failed to list *v1.PersistentVolume: persistentvolumes is forbidden: User \"system:kube-scheduler\" cannot list resource \"persistentvolumes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.PersistentVolume"
	E1124 08:54:09.970254       1 reflector.go:205] "Failed to watch" err="failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User \"system:kube-scheduler\" cannot list resource \"persistentvolumeclaims\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.PersistentVolumeClaim"
	E1124 08:54:10.012374       1 reflector.go:205] "Failed to watch" err="failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"storageclasses\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.StorageClass"
	E1124 08:54:10.028349       1 reflector.go:205] "Failed to watch" err="failed to list *v1.StatefulSet: statefulsets.apps is forbidden: User \"system:kube-scheduler\" cannot list resource \"statefulsets\" in API group \"apps\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.StatefulSet"
	E1124 08:54:10.089620       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Pod: pods is forbidden: User \"system:kube-scheduler\" cannot list resource \"pods\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Pod"
	E1124 08:54:10.145424       1 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
	E1124 08:54:10.145452       1 reflector.go:205] "Failed to watch" err="failed to list *v1.DeviceClass: deviceclasses.resource.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"deviceclasses\" in API group \"resource.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.DeviceClass"
	E1124 08:54:10.166448       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ReplicationController: replicationcontrollers is forbidden: User \"system:kube-scheduler\" cannot list resource \"replicationcontrollers\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ReplicationController"
	E1124 08:54:10.168975       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:kube-scheduler\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
	E1124 08:54:10.472917       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"extension-apiserver-authentication\" is forbidden: User \"system:kube-scheduler\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"kube-system\"" logger="UnhandledError" reflector="runtime/asm_arm64.s:1223" type="*v1.ConfigMap"
	I1124 08:54:12.282734       1 shared_informer.go:356] "Caches are synced" controller="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
	I1124 08:55:28.427975       1 secure_serving.go:259] Stopped listening on 127.0.0.1:10259
	I1124 08:55:28.428004       1 server.go:263] "[graceful-termination] secure server has stopped listening"
	I1124 08:55:28.428018       1 configmap_cafile_content.go:226] "Shutting down controller" name="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
	I1124 08:55:28.428088       1 server.go:265] "[graceful-termination] secure server is exiting"
	E1124 08:55:28.428102       1 run.go:72] "command failed" err="finished without leader elect"
	
	
	==> kube-scheduler [4db14bc0f3747bb08ccdf3a4232220b42a95e683884234f9fff57a09e95b88c3] <==
	E1124 08:55:36.598347       1 reflector.go:205] "Failed to watch" err="failed to list *v1.VolumeAttachment: Get \"https://192.168.49.2:8441/apis/storage.k8s.io/v1/volumeattachments?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.VolumeAttachment"
	E1124 08:55:36.838482       1 reflector.go:205] "Failed to watch" err="failed to list *v1.PodDisruptionBudget: Get \"https://192.168.49.2:8441/apis/policy/v1/poddisruptionbudgets?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.PodDisruptionBudget"
	E1124 08:55:37.242275       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Namespace: Get \"https://192.168.49.2:8441/api/v1/namespaces?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Namespace"
	E1124 08:55:37.413389       1 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://192.168.49.2:8441/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
	E1124 08:55:37.515451       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: Get \"https://192.168.49.2:8441/api/v1/services?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
	E1124 08:55:37.711691       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Pod: Get \"https://192.168.49.2:8441/api/v1/pods?fieldSelector=status.phase%21%3DSucceeded%2Cstatus.phase%21%3DFailed&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Pod"
	E1124 08:55:38.082012       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ConfigMap: Get \"https://192.168.49.2:8441/api/v1/namespaces/kube-system/configmaps?fieldSelector=metadata.name%3Dextension-apiserver-authentication&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="runtime/asm_arm64.s:1223" type="*v1.ConfigMap"
	E1124 08:55:38.154030       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ResourceClaim: Get \"https://192.168.49.2:8441/apis/resource.k8s.io/v1/resourceclaims?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ResourceClaim"
	E1124 08:55:38.280385       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ResourceSlice: Get \"https://192.168.49.2:8441/apis/resource.k8s.io/v1/resourceslices?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ResourceSlice"
	E1124 08:55:38.363163       1 reflector.go:205] "Failed to watch" err="failed to list *v1.CSINode: Get \"https://192.168.49.2:8441/apis/storage.k8s.io/v1/csinodes?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSINode"
	E1124 08:55:38.536440       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://192.168.49.2:8441/api/v1/nodes?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	E1124 08:55:38.785077       1 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIStorageCapacity: Get \"https://192.168.49.2:8441/apis/storage.k8s.io/v1/csistoragecapacities?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIStorageCapacity"
	E1124 08:55:38.930056       1 reflector.go:205] "Failed to watch" err="failed to list *v1.DeviceClass: Get \"https://192.168.49.2:8441/apis/resource.k8s.io/v1/deviceclasses?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.DeviceClass"
	E1124 08:55:38.985957       1 reflector.go:205] "Failed to watch" err="failed to list *v1.PersistentVolumeClaim: Get \"https://192.168.49.2:8441/api/v1/persistentvolumeclaims?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.PersistentVolumeClaim"
	E1124 08:55:39.039078       1 reflector.go:205] "Failed to watch" err="failed to list *v1.StatefulSet: Get \"https://192.168.49.2:8441/apis/apps/v1/statefulsets?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.StatefulSet"
	E1124 08:55:39.348953       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ReplicaSet: Get \"https://192.168.49.2:8441/apis/apps/v1/replicasets?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ReplicaSet"
	E1124 08:55:39.714746       1 reflector.go:205] "Failed to watch" err="failed to list *v1.StorageClass: Get \"https://192.168.49.2:8441/apis/storage.k8s.io/v1/storageclasses?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.StorageClass"
	E1124 08:55:39.840410       1 reflector.go:205] "Failed to watch" err="failed to list *v1.PersistentVolume: Get \"https://192.168.49.2:8441/api/v1/persistentvolumes?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.PersistentVolume"
	E1124 08:55:46.556944       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ResourceClaim: resourceclaims.resource.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"resourceclaims\" in API group \"resource.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ResourceClaim"
	E1124 08:55:46.557177       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ReplicationController: replicationcontrollers is forbidden: User \"system:kube-scheduler\" cannot list resource \"replicationcontrollers\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ReplicationController"
	E1124 08:55:46.557352       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Pod: pods is forbidden: User \"system:kube-scheduler\" cannot list resource \"pods\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Pod"
	E1124 08:55:46.557551       1 reflector.go:205] "Failed to watch" err="failed to list *v1.PodDisruptionBudget: poddisruptionbudgets.policy is forbidden: User \"system:kube-scheduler\" cannot list resource \"poddisruptionbudgets\" in API group \"policy\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.PodDisruptionBudget"
	E1124 08:55:46.557775       1 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
	E1124 08:55:46.558034       1 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIStorageCapacity: csistoragecapacities.storage.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"csistoragecapacities\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIStorageCapacity"
	I1124 08:55:47.832870       1 shared_informer.go:356] "Caches are synced" controller="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
	
	
	==> kubelet <==
	Nov 24 09:06:36 functional-941011 kubelet[4469]:         toomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit
	Nov 24 09:06:36 functional-941011 kubelet[4469]:  > image="kicbase/echo-server:latest"
	Nov 24 09:06:36 functional-941011 kubelet[4469]: E1124 09:06:36.897717    4469 kuberuntime_image.go:43] "Failed to pull image" err=<
	Nov 24 09:06:36 functional-941011 kubelet[4469]:         failed to pull and unpack image "docker.io/kicbase/echo-server:latest": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/kicbase/echo-server/manifests/sha256:42a89d9b22e5307cb88494990d5d929c401339f508c0a7e98a4d8ac52623fc5b: 429 Too Many Requests
	Nov 24 09:06:36 functional-941011 kubelet[4469]:         toomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit
	Nov 24 09:06:36 functional-941011 kubelet[4469]:  > image="kicbase/echo-server:latest"
	Nov 24 09:06:36 functional-941011 kubelet[4469]: E1124 09:06:36.898241    4469 kuberuntime_manager.go:1449] "Unhandled Error" err=<
	Nov 24 09:06:36 functional-941011 kubelet[4469]:         container echo-server start failed in pod hello-node-connect-7d85dfc575-vjgqt_default(672307bf-173d-40e2-b7c4-0b44c2cea44f): ErrImagePull: failed to pull and unpack image "docker.io/kicbase/echo-server:latest": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/kicbase/echo-server/manifests/sha256:42a89d9b22e5307cb88494990d5d929c401339f508c0a7e98a4d8ac52623fc5b: 429 Too Many Requests
	Nov 24 09:06:36 functional-941011 kubelet[4469]:         toomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit
	Nov 24 09:06:36 functional-941011 kubelet[4469]:  > logger="UnhandledError"
	Nov 24 09:06:36 functional-941011 kubelet[4469]: E1124 09:06:36.898302    4469 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"echo-server\" with ErrImagePull: \"failed to pull and unpack image \\\"docker.io/kicbase/echo-server:latest\\\": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/kicbase/echo-server/manifests/sha256:42a89d9b22e5307cb88494990d5d929c401339f508c0a7e98a4d8ac52623fc5b: 429 Too Many Requests\\ntoomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit\"" pod="default/hello-node-connect-7d85dfc575-vjgqt" podUID="672307bf-173d-40e2-b7c4-0b44c2cea44f"
	Nov 24 09:06:41 functional-941011 kubelet[4469]: E1124 09:06:41.371304    4469 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nginx\" with ImagePullBackOff: \"Back-off pulling image \\\"docker.io/nginx:alpine\\\": ErrImagePull: failed to pull and unpack image \\\"docker.io/library/nginx:alpine\\\": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/library/nginx/manifests/sha256:b3c656d55d7ad751196f21b7fd2e8d4da9cb430e32f646adcf92441b72f82b14: 429 Too Many Requests\\ntoomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit\"" pod="default/nginx-svc" podUID="d266cb95-cc24-4a91-a764-df9ddabaf208"
	Nov 24 09:06:43 functional-941011 kubelet[4469]: E1124 09:06:43.370322    4469 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"myfrontend\" with ImagePullBackOff: \"Back-off pulling image \\\"docker.io/nginx\\\": ErrImagePull: failed to pull and unpack image \\\"docker.io/library/nginx:latest\\\": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/library/nginx/manifests/sha256:553f64aecdc31b5bf944521731cd70e35da4faed96b2b7548a3d8e2598c52a42: 429 Too Many Requests\\ntoomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit\"" pod="default/sp-pod" podUID="8483abe7-6e51-4bf9-9b52-141abe46cd3e"
	Nov 24 09:06:44 functional-941011 kubelet[4469]: E1124 09:06:44.370775    4469 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dashboard-metrics-scraper\" with ImagePullBackOff: \"Back-off pulling image \\\"docker.io/kubernetesui/metrics-scraper:v1.0.8@sha256:76049887f07a0476dc93efc2d3569b9529bf982b22d29f356092ce206e98765c\\\": ErrImagePull: failed to pull and unpack image \\\"docker.io/kubernetesui/metrics-scraper@sha256:76049887f07a0476dc93efc2d3569b9529bf982b22d29f356092ce206e98765c\\\": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/kubernetesui/metrics-scraper/manifests/sha256:76049887f07a0476dc93efc2d3569b9529bf982b22d29f356092ce206e98765c: 429 Too Many Requests\\ntoomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit\"" pod="kubernetes-dashboard/dashboard-metrics-scraper-77bf4d6c4c-5qb25" podUID="cd7638bf-35f8-43ed-a194-705cf32ed1fc"
	Nov 24 09:06:44 functional-941011 kubelet[4469]: E1124 09:06:44.372205    4469 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubernetes-dashboard\" with ImagePullBackOff: \"Back-off pulling image \\\"docker.io/kubernetesui/dashboard:v2.7.0@sha256:2e500d29e9d5f4a086b908eb8dfe7ecac57d2ab09d65b24f588b1d449841ef93\\\": ErrImagePull: failed to pull and unpack image \\\"docker.io/kubernetesui/dashboard@sha256:2e500d29e9d5f4a086b908eb8dfe7ecac57d2ab09d65b24f588b1d449841ef93\\\": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/kubernetesui/dashboard/manifests/sha256:5c52c60663b473628bd98e4ffee7a747ef1f88d8c7bcee957b089fb3f61bdedf: 429 Too Many Requests\\ntoomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit\"" pod="kubernetes-dashboard/kubernetes-dashboard-855c9754f9-tww5k" podUID="62e0e2ab-361c-43b1-b518-50508c7157ce"
	Nov 24 09:06:48 functional-941011 kubelet[4469]: E1124 09:06:48.370527    4469 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"echo-server\" with ImagePullBackOff: \"Back-off pulling image \\\"kicbase/echo-server\\\": ErrImagePull: failed to pull and unpack image \\\"docker.io/kicbase/echo-server:latest\\\": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/kicbase/echo-server/manifests/sha256:42a89d9b22e5307cb88494990d5d929c401339f508c0a7e98a4d8ac52623fc5b: 429 Too Many Requests\\ntoomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit\"" pod="default/hello-node-connect-7d85dfc575-vjgqt" podUID="672307bf-173d-40e2-b7c4-0b44c2cea44f"
	Nov 24 09:06:52 functional-941011 kubelet[4469]: E1124 09:06:52.371854    4469 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nginx\" with ImagePullBackOff: \"Back-off pulling image \\\"docker.io/nginx:alpine\\\": ErrImagePull: failed to pull and unpack image \\\"docker.io/library/nginx:alpine\\\": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/library/nginx/manifests/sha256:b3c656d55d7ad751196f21b7fd2e8d4da9cb430e32f646adcf92441b72f82b14: 429 Too Many Requests\\ntoomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit\"" pod="default/nginx-svc" podUID="d266cb95-cc24-4a91-a764-df9ddabaf208"
	Nov 24 09:06:56 functional-941011 kubelet[4469]: E1124 09:06:56.371648    4469 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"myfrontend\" with ImagePullBackOff: \"Back-off pulling image \\\"docker.io/nginx\\\": ErrImagePull: failed to pull and unpack image \\\"docker.io/library/nginx:latest\\\": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/library/nginx/manifests/sha256:553f64aecdc31b5bf944521731cd70e35da4faed96b2b7548a3d8e2598c52a42: 429 Too Many Requests\\ntoomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit\"" pod="default/sp-pod" podUID="8483abe7-6e51-4bf9-9b52-141abe46cd3e"
	Nov 24 09:06:57 functional-941011 kubelet[4469]: E1124 09:06:57.371154    4469 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dashboard-metrics-scraper\" with ImagePullBackOff: \"Back-off pulling image \\\"docker.io/kubernetesui/metrics-scraper:v1.0.8@sha256:76049887f07a0476dc93efc2d3569b9529bf982b22d29f356092ce206e98765c\\\": ErrImagePull: failed to pull and unpack image \\\"docker.io/kubernetesui/metrics-scraper@sha256:76049887f07a0476dc93efc2d3569b9529bf982b22d29f356092ce206e98765c\\\": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/kubernetesui/metrics-scraper/manifests/sha256:76049887f07a0476dc93efc2d3569b9529bf982b22d29f356092ce206e98765c: 429 Too Many Requests\\ntoomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit\"" pod="kubernetes-dashboard/dashboard-metrics-scraper-77bf4d6c4c-5qb25" podUID="cd7638bf-35f8-43ed-a194-705cf32ed1fc"
	Nov 24 09:06:58 functional-941011 kubelet[4469]: E1124 09:06:58.370851    4469 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubernetes-dashboard\" with ImagePullBackOff: \"Back-off pulling image \\\"docker.io/kubernetesui/dashboard:v2.7.0@sha256:2e500d29e9d5f4a086b908eb8dfe7ecac57d2ab09d65b24f588b1d449841ef93\\\": ErrImagePull: failed to pull and unpack image \\\"docker.io/kubernetesui/dashboard@sha256:2e500d29e9d5f4a086b908eb8dfe7ecac57d2ab09d65b24f588b1d449841ef93\\\": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/kubernetesui/dashboard/manifests/sha256:5c52c60663b473628bd98e4ffee7a747ef1f88d8c7bcee957b089fb3f61bdedf: 429 Too Many Requests\\ntoomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit\"" pod="kubernetes-dashboard/kubernetes-dashboard-855c9754f9-tww5k" podUID="62e0e2ab-361c-43b1-b518-50508c7157ce"
	Nov 24 09:07:02 functional-941011 kubelet[4469]: E1124 09:07:02.371238    4469 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"echo-server\" with ImagePullBackOff: \"Back-off pulling image \\\"kicbase/echo-server\\\": ErrImagePull: failed to pull and unpack image \\\"docker.io/kicbase/echo-server:latest\\\": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/kicbase/echo-server/manifests/sha256:42a89d9b22e5307cb88494990d5d929c401339f508c0a7e98a4d8ac52623fc5b: 429 Too Many Requests\\ntoomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit\"" pod="default/hello-node-connect-7d85dfc575-vjgqt" podUID="672307bf-173d-40e2-b7c4-0b44c2cea44f"
	Nov 24 09:07:06 functional-941011 kubelet[4469]: E1124 09:07:06.370808    4469 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nginx\" with ImagePullBackOff: \"Back-off pulling image \\\"docker.io/nginx:alpine\\\": ErrImagePull: failed to pull and unpack image \\\"docker.io/library/nginx:alpine\\\": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/library/nginx/manifests/sha256:b3c656d55d7ad751196f21b7fd2e8d4da9cb430e32f646adcf92441b72f82b14: 429 Too Many Requests\\ntoomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit\"" pod="default/nginx-svc" podUID="d266cb95-cc24-4a91-a764-df9ddabaf208"
	Nov 24 09:07:09 functional-941011 kubelet[4469]: E1124 09:07:09.370873    4469 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubernetes-dashboard\" with ImagePullBackOff: \"Back-off pulling image \\\"docker.io/kubernetesui/dashboard:v2.7.0@sha256:2e500d29e9d5f4a086b908eb8dfe7ecac57d2ab09d65b24f588b1d449841ef93\\\": ErrImagePull: failed to pull and unpack image \\\"docker.io/kubernetesui/dashboard@sha256:2e500d29e9d5f4a086b908eb8dfe7ecac57d2ab09d65b24f588b1d449841ef93\\\": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/kubernetesui/dashboard/manifests/sha256:5c52c60663b473628bd98e4ffee7a747ef1f88d8c7bcee957b089fb3f61bdedf: 429 Too Many Requests\\ntoomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit\"" pod="kubernetes-dashboard/kubernetes-dashboard-855c9754f9-tww5k" podUID="62e0e2ab-361c-43b1-b518-50508c7157ce"
	Nov 24 09:07:10 functional-941011 kubelet[4469]: E1124 09:07:10.370751    4469 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dashboard-metrics-scraper\" with ImagePullBackOff: \"Back-off pulling image \\\"docker.io/kubernetesui/metrics-scraper:v1.0.8@sha256:76049887f07a0476dc93efc2d3569b9529bf982b22d29f356092ce206e98765c\\\": ErrImagePull: failed to pull and unpack image \\\"docker.io/kubernetesui/metrics-scraper@sha256:76049887f07a0476dc93efc2d3569b9529bf982b22d29f356092ce206e98765c\\\": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/kubernetesui/metrics-scraper/manifests/sha256:76049887f07a0476dc93efc2d3569b9529bf982b22d29f356092ce206e98765c: 429 Too Many Requests\\ntoomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit\"" pod="kubernetes-dashboard/dashboard-metrics-scraper-77bf4d6c4c-5qb25" podUID="cd7638bf-35f8-43ed-a194-705cf32ed1fc"
	Nov 24 09:07:11 functional-941011 kubelet[4469]: E1124 09:07:11.370284    4469 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"myfrontend\" with ImagePullBackOff: \"Back-off pulling image \\\"docker.io/nginx\\\": ErrImagePull: failed to pull and unpack image \\\"docker.io/library/nginx:latest\\\": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/library/nginx/manifests/sha256:553f64aecdc31b5bf944521731cd70e35da4faed96b2b7548a3d8e2598c52a42: 429 Too Many Requests\\ntoomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit\"" pod="default/sp-pod" podUID="8483abe7-6e51-4bf9-9b52-141abe46cd3e"
	
	
	==> storage-provisioner [b6a84160fb5a462413dc19bd1857be8f2613f26a584ad28f2b8052ac97712585] <==
	I1124 08:55:29.097755       1 storage_provisioner.go:116] Initializing the minikube storage provisioner...
	F1124 08:55:29.099748       1 main.go:39] error getting server version: Get "https://10.96.0.1:443/version?timeout=32s": dial tcp 10.96.0.1:443: connect: connection refused
	
	
	==> storage-provisioner [e068a2929b9a7acaf417417b25ba4835b5824322af5bcbddaa3f41f7d5cb8575] <==
	W1124 09:06:48.339173       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 09:06:50.342335       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 09:06:50.348945       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 09:06:52.353009       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 09:06:52.357694       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 09:06:54.360881       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 09:06:54.367793       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 09:06:56.372859       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 09:06:56.378429       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 09:06:58.381961       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 09:06:58.386715       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 09:07:00.395479       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 09:07:00.404942       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 09:07:02.408624       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 09:07:02.413917       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 09:07:04.417574       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 09:07:04.422523       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 09:07:06.425803       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 09:07:06.432835       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 09:07:08.436049       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 09:07:08.440870       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 09:07:10.444319       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 09:07:10.448998       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 09:07:12.452555       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 09:07:12.458554       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	

                                                
                                                
-- /stdout --
helpers_test.go:262: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-941011 -n functional-941011
helpers_test.go:269: (dbg) Run:  kubectl --context functional-941011 get po -o=jsonpath={.items[*].metadata.name} -A --field-selector=status.phase!=Running
helpers_test.go:280: non-running pods: busybox-mount hello-node-connect-7d85dfc575-vjgqt nginx-svc sp-pod dashboard-metrics-scraper-77bf4d6c4c-5qb25 kubernetes-dashboard-855c9754f9-tww5k
helpers_test.go:282: ======> post-mortem[TestFunctional/parallel/DashboardCmd]: describe non-running pods <======
helpers_test.go:285: (dbg) Run:  kubectl --context functional-941011 describe pod busybox-mount hello-node-connect-7d85dfc575-vjgqt nginx-svc sp-pod dashboard-metrics-scraper-77bf4d6c4c-5qb25 kubernetes-dashboard-855c9754f9-tww5k
helpers_test.go:285: (dbg) Non-zero exit: kubectl --context functional-941011 describe pod busybox-mount hello-node-connect-7d85dfc575-vjgqt nginx-svc sp-pod dashboard-metrics-scraper-77bf4d6c4c-5qb25 kubernetes-dashboard-855c9754f9-tww5k: exit status 1 (134.864653ms)

                                                
                                                
-- stdout --
	Name:             busybox-mount
	Namespace:        default
	Priority:         0
	Service Account:  default
	Node:             functional-941011/192.168.49.2
	Start Time:       Mon, 24 Nov 2025 09:02:01 +0000
	Labels:           integration-test=busybox-mount
	Annotations:      <none>
	Status:           Succeeded
	IP:               10.244.0.8
	IPs:
	  IP:  10.244.0.8
	Containers:
	  mount-munger:
	    Container ID:  containerd://423ebfbd588615eb4bc213b5f471b5301692b734e90d6664d802a7debc9f3474
	    Image:         gcr.io/k8s-minikube/busybox:1.28.4-glibc
	    Image ID:      gcr.io/k8s-minikube/busybox@sha256:2d03e6ceeb99250061dd110530b0ece7998cd84121f952adef120ea7c5a6f00e
	    Port:          <none>
	    Host Port:     <none>
	    Command:
	      /bin/sh
	      -c
	      --
	    Args:
	      cat /mount-9p/created-by-test; echo test > /mount-9p/created-by-pod; rm /mount-9p/created-by-test-removed-by-pod; echo test > /mount-9p/created-by-pod-removed-by-test date >> /mount-9p/pod-dates
	    State:          Terminated
	      Reason:       Completed
	      Exit Code:    0
	      Started:      Mon, 24 Nov 2025 09:02:03 +0000
	      Finished:     Mon, 24 Nov 2025 09:02:03 +0000
	    Ready:          False
	    Restart Count:  0
	    Environment:    <none>
	    Mounts:
	      /mount-9p from test-volume (rw)
	      /var/run/secrets/kubernetes.io/serviceaccount from kube-api-access-6f7z2 (ro)
	Conditions:
	  Type                        Status
	  PodReadyToStartContainers   False 
	  Initialized                 True 
	  Ready                       False 
	  ContainersReady             False 
	  PodScheduled                True 
	Volumes:
	  test-volume:
	    Type:          HostPath (bare host directory volume)
	    Path:          /mount-9p
	    HostPathType:  
	  kube-api-access-6f7z2:
	    Type:                    Projected (a volume that contains injected data from multiple sources)
	    TokenExpirationSeconds:  3607
	    ConfigMapName:           kube-root-ca.crt
	    Optional:                false
	    DownwardAPI:             true
	QoS Class:                   BestEffort
	Node-Selectors:              <none>
	Tolerations:                 node.kubernetes.io/not-ready:NoExecute op=Exists for 300s
	                             node.kubernetes.io/unreachable:NoExecute op=Exists for 300s
	Events:
	  Type    Reason     Age    From               Message
	  ----    ------     ----   ----               -------
	  Normal  Scheduled  5m12s  default-scheduler  Successfully assigned default/busybox-mount to functional-941011
	  Normal  Pulling    5m12s  kubelet            Pulling image "gcr.io/k8s-minikube/busybox:1.28.4-glibc"
	  Normal  Pulled     5m10s  kubelet            Successfully pulled image "gcr.io/k8s-minikube/busybox:1.28.4-glibc" in 2.061s (2.062s including waiting). Image size: 1935750 bytes.
	  Normal  Created    5m10s  kubelet            Created container: mount-munger
	  Normal  Started    5m10s  kubelet            Started container mount-munger
	
	
	Name:             hello-node-connect-7d85dfc575-vjgqt
	Namespace:        default
	Priority:         0
	Service Account:  default
	Node:             functional-941011/192.168.49.2
	Start Time:       Mon, 24 Nov 2025 09:00:44 +0000
	Labels:           app=hello-node-connect
	                  pod-template-hash=7d85dfc575
	Annotations:      <none>
	Status:           Pending
	IP:               10.244.0.7
	IPs:
	  IP:           10.244.0.7
	Controlled By:  ReplicaSet/hello-node-connect-7d85dfc575
	Containers:
	  echo-server:
	    Container ID:   
	    Image:          kicbase/echo-server
	    Image ID:       
	    Port:           <none>
	    Host Port:      <none>
	    State:          Waiting
	      Reason:       ImagePullBackOff
	    Ready:          False
	    Restart Count:  0
	    Environment:    <none>
	    Mounts:
	      /var/run/secrets/kubernetes.io/serviceaccount from kube-api-access-8n6nc (ro)
	Conditions:
	  Type                        Status
	  PodReadyToStartContainers   True 
	  Initialized                 True 
	  Ready                       False 
	  ContainersReady             False 
	  PodScheduled                True 
	Volumes:
	  kube-api-access-8n6nc:
	    Type:                    Projected (a volume that contains injected data from multiple sources)
	    TokenExpirationSeconds:  3607
	    ConfigMapName:           kube-root-ca.crt
	    Optional:                false
	    DownwardAPI:             true
	QoS Class:                   BestEffort
	Node-Selectors:              <none>
	Tolerations:                 node.kubernetes.io/not-ready:NoExecute op=Exists for 300s
	                             node.kubernetes.io/unreachable:NoExecute op=Exists for 300s
	Events:
	  Type     Reason     Age    From               Message
	  ----     ------     ----   ----               -------
	  Normal   Scheduled  6m29s  default-scheduler  Successfully assigned default/hello-node-connect-7d85dfc575-vjgqt to functional-941011
	  Warning  Failed     6m28s  kubelet            Failed to pull image "kicbase/echo-server": failed to pull and unpack image "docker.io/kicbase/echo-server:latest": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/kicbase/echo-server/manifests/sha256:42a89d9b22e5307cb88494990d5d929c401339f508c0a7e98a4d8ac52623fc5b: 429 Too Many Requests
	toomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit
	  Normal   Pulling  3m26s (x5 over 6m29s)  kubelet  Pulling image "kicbase/echo-server"
	  Warning  Failed   3m26s (x5 over 6m28s)  kubelet  Error: ErrImagePull
	  Warning  Failed   3m26s (x4 over 6m14s)  kubelet  Failed to pull image "kicbase/echo-server": failed to pull and unpack image "docker.io/kicbase/echo-server:latest": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/kicbase/echo-server/manifests/sha256:127ac38a2bb9537b7f252addff209ea6801edcac8a92c8b1104dacd66a583ed6: 429 Too Many Requests
	toomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit
	  Normal   BackOff  78s (x21 over 6m27s)  kubelet  Back-off pulling image "kicbase/echo-server"
	  Warning  Failed   78s (x21 over 6m27s)  kubelet  Error: ImagePullBackOff
	
	
	Name:             nginx-svc
	Namespace:        default
	Priority:         0
	Service Account:  default
	Node:             functional-941011/192.168.49.2
	Start Time:       Mon, 24 Nov 2025 08:56:29 +0000
	Labels:           run=nginx-svc
	Annotations:      <none>
	Status:           Pending
	IP:               10.244.0.5
	IPs:
	  IP:  10.244.0.5
	Containers:
	  nginx:
	    Container ID:   
	    Image:          docker.io/nginx:alpine
	    Image ID:       
	    Port:           80/TCP
	    Host Port:      0/TCP
	    State:          Waiting
	      Reason:       ImagePullBackOff
	    Ready:          False
	    Restart Count:  0
	    Environment:    <none>
	    Mounts:
	      /var/run/secrets/kubernetes.io/serviceaccount from kube-api-access-qt757 (ro)
	Conditions:
	  Type                        Status
	  PodReadyToStartContainers   True 
	  Initialized                 True 
	  Ready                       False 
	  ContainersReady             False 
	  PodScheduled                True 
	Volumes:
	  kube-api-access-qt757:
	    Type:                    Projected (a volume that contains injected data from multiple sources)
	    TokenExpirationSeconds:  3607
	    ConfigMapName:           kube-root-ca.crt
	    Optional:                false
	    DownwardAPI:             true
	QoS Class:                   BestEffort
	Node-Selectors:              <none>
	Tolerations:                 node.kubernetes.io/not-ready:NoExecute op=Exists for 300s
	                             node.kubernetes.io/unreachable:NoExecute op=Exists for 300s
	Events:
	  Type     Reason     Age                  From               Message
	  ----     ------     ----                 ----               -------
	  Normal   Scheduled  10m                  default-scheduler  Successfully assigned default/nginx-svc to functional-941011
	  Warning  Failed     9m14s (x3 over 10m)  kubelet            Failed to pull image "docker.io/nginx:alpine": failed to pull and unpack image "docker.io/library/nginx:alpine": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/library/nginx/manifests/sha256:b3c656d55d7ad751196f21b7fd2e8d4da9cb430e32f646adcf92441b72f82b14: 429 Too Many Requests
	toomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit
	  Normal   Pulling  7m51s (x5 over 10m)  kubelet  Pulling image "docker.io/nginx:alpine"
	  Warning  Failed   7m51s (x5 over 10m)  kubelet  Error: ErrImagePull
	  Warning  Failed   7m51s (x2 over 10m)  kubelet  Failed to pull image "docker.io/nginx:alpine": failed to pull and unpack image "docker.io/library/nginx:alpine": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/library/nginx/manifests/sha256:7391b3732e7f7ccd23ff1d02fbeadcde496f374d7460ad8e79260f8f6d2c9f90: 429 Too Many Requests
	toomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit
	  Normal   BackOff  32s (x42 over 10m)  kubelet  Back-off pulling image "docker.io/nginx:alpine"
	  Warning  Failed   32s (x42 over 10m)  kubelet  Error: ImagePullBackOff
	
	
	Name:             sp-pod
	Namespace:        default
	Priority:         0
	Service Account:  default
	Node:             functional-941011/192.168.49.2
	Start Time:       Mon, 24 Nov 2025 08:56:41 +0000
	Labels:           test=storage-provisioner
	Annotations:      <none>
	Status:           Pending
	IP:               10.244.0.6
	IPs:
	  IP:  10.244.0.6
	Containers:
	  myfrontend:
	    Container ID:   
	    Image:          docker.io/nginx
	    Image ID:       
	    Port:           <none>
	    Host Port:      <none>
	    State:          Waiting
	      Reason:       ImagePullBackOff
	    Ready:          False
	    Restart Count:  0
	    Environment:    <none>
	    Mounts:
	      /tmp/mount from mypd (rw)
	      /var/run/secrets/kubernetes.io/serviceaccount from kube-api-access-26mp7 (ro)
	Conditions:
	  Type                        Status
	  PodReadyToStartContainers   True 
	  Initialized                 True 
	  Ready                       False 
	  ContainersReady             False 
	  PodScheduled                True 
	Volumes:
	  mypd:
	    Type:       PersistentVolumeClaim (a reference to a PersistentVolumeClaim in the same namespace)
	    ClaimName:  myclaim
	    ReadOnly:   false
	  kube-api-access-26mp7:
	    Type:                    Projected (a volume that contains injected data from multiple sources)
	    TokenExpirationSeconds:  3607
	    ConfigMapName:           kube-root-ca.crt
	    Optional:                false
	    DownwardAPI:             true
	QoS Class:                   BestEffort
	Node-Selectors:              <none>
	Tolerations:                 node.kubernetes.io/not-ready:NoExecute op=Exists for 300s
	                             node.kubernetes.io/unreachable:NoExecute op=Exists for 300s
	Events:
	  Type     Reason     Age                  From               Message
	  ----     ------     ----                 ----               -------
	  Normal   Scheduled  10m                  default-scheduler  Successfully assigned default/sp-pod to functional-941011
	  Normal   Pulling    7m36s (x5 over 10m)  kubelet            Pulling image "docker.io/nginx"
	  Warning  Failed     7m36s (x5 over 10m)  kubelet            Failed to pull image "docker.io/nginx": failed to pull and unpack image "docker.io/library/nginx:latest": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/library/nginx/manifests/sha256:553f64aecdc31b5bf944521731cd70e35da4faed96b2b7548a3d8e2598c52a42: 429 Too Many Requests
	toomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit
	  Warning  Failed   7m36s (x5 over 10m)  kubelet  Error: ErrImagePull
	  Normal   BackOff  30s (x42 over 10m)   kubelet  Back-off pulling image "docker.io/nginx"
	  Warning  Failed   30s (x42 over 10m)   kubelet  Error: ImagePullBackOff

-- /stdout --
** stderr ** 
	Error from server (NotFound): pods "dashboard-metrics-scraper-77bf4d6c4c-5qb25" not found
	Error from server (NotFound): pods "kubernetes-dashboard-855c9754f9-tww5k" not found

** /stderr **
helpers_test.go:287: kubectl --context functional-941011 describe pod busybox-mount hello-node-connect-7d85dfc575-vjgqt nginx-svc sp-pod dashboard-metrics-scraper-77bf4d6c4c-5qb25 kubernetes-dashboard-855c9754f9-tww5k: exit status 1
--- FAIL: TestFunctional/parallel/DashboardCmd (302.68s)
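The failure above is not a dashboard bug: every unready pod in the dump is stuck in ImagePullBackOff behind Docker Hub's unauthenticated 429 rate limit. As a quick triage sketch (sample event lines inlined below, not read live from the cluster), the Warning/Failed pull events in a `kubectl describe pod` dump can be counted with plain grep:

```shell
# Triage sketch: count image-pull failure events in a describe-pod events dump.
# The sample lines are copied from the sp-pod events above; against a live
# cluster one would pipe `kubectl describe pod <name>` in instead.
events='Warning  Failed     7m36s (x5 over 10m)  kubelet  Error: ErrImagePull
Normal   BackOff    30s (x42 over 10m)   kubelet  Back-off pulling image "docker.io/nginx"
Warning  Failed     30s (x42 over 10m)   kubelet  Error: ImagePullBackOff'

# Count only the Warning/Failed rows, ignoring the Normal BackOff row.
failures=$(printf '%s\n' "$events" | grep -c '^Warning  Failed')
echo "pull failures: $failures"
```

With the sample above this reports 2 failure events; a nonzero count combined with a `429 Too Many Requests` message in the same dump points at registry rate limiting rather than a broken image reference.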

TestFunctional/parallel/ServiceCmdConnect (603.67s)

=== RUN   TestFunctional/parallel/ServiceCmdConnect
=== PAUSE TestFunctional/parallel/ServiceCmdConnect

=== CONT  TestFunctional/parallel/ServiceCmdConnect
functional_test.go:1636: (dbg) Run:  kubectl --context functional-941011 create deployment hello-node-connect --image kicbase/echo-server
functional_test.go:1640: (dbg) Run:  kubectl --context functional-941011 expose deployment hello-node-connect --type=NodePort --port=8080
functional_test.go:1645: (dbg) TestFunctional/parallel/ServiceCmdConnect: waiting 10m0s for pods matching "app=hello-node-connect" in namespace "default" ...
helpers_test.go:352: "hello-node-connect-7d85dfc575-vjgqt" [672307bf-173d-40e2-b7c4-0b44c2cea44f] Pending / Ready:ContainersNotReady (containers with unready status: [echo-server]) / ContainersReady:ContainersNotReady (containers with unready status: [echo-server])
I1124 09:00:46.324532 1654467 retry.go:31] will retry after 7.788835545s: Temporary Error: Get "http:": http: no Host in request URL
I1124 09:00:54.113601 1654467 retry.go:31] will retry after 8.805759885s: Temporary Error: Get "http:": http: no Host in request URL
I1124 09:01:02.920280 1654467 retry.go:31] will retry after 23.703830737s: Temporary Error: Get "http:": http: no Host in request URL
E1124 09:01:03.604289 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/addons-674149/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
I1124 09:01:26.624286 1654467 retry.go:31] will retry after 30.539720281s: Temporary Error: Get "http:": http: no Host in request URL
E1124 09:01:31.311597 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/addons-674149/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
helpers_test.go:337: TestFunctional/parallel/ServiceCmdConnect: WARNING: pod list for "default" "app=hello-node-connect" returned: client rate limiter Wait returned an error: context deadline exceeded
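The `retry.go:31] will retry after …` lines above show the harness backing off between probes with roughly doubling, jittered delays. A minimal fixed-doubling sketch of that pattern (minikube's actual retry helper randomizes the delay; this illustration does not):

```shell
# Illustration only: exponential backoff without jitter. Each failed attempt
# doubles the wait before the next probe, as in the retry log lines above.
delay=2
for attempt in 1 2 3; do
  echo "attempt $attempt failed; will retry after ${delay}s"
  delay=$((delay * 2))
done
```

Doubling keeps early retries cheap while bounding pressure on a struggling endpoint; the jitter the real helper adds prevents many waiters from retrying in lockstep.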
functional_test.go:1645: ***** TestFunctional/parallel/ServiceCmdConnect: pod "app=hello-node-connect" failed to start within 10m0s: context deadline exceeded ****
functional_test.go:1645: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-941011 -n functional-941011
functional_test.go:1645: TestFunctional/parallel/ServiceCmdConnect: showing logs for failed pods as of 2025-11-24 09:10:44.97805256 +0000 UTC m=+1660.345359090
functional_test.go:1645: (dbg) Run:  kubectl --context functional-941011 describe po hello-node-connect-7d85dfc575-vjgqt -n default
functional_test.go:1645: (dbg) kubectl --context functional-941011 describe po hello-node-connect-7d85dfc575-vjgqt -n default:
Name:             hello-node-connect-7d85dfc575-vjgqt
Namespace:        default
Priority:         0
Service Account:  default
Node:             functional-941011/192.168.49.2
Start Time:       Mon, 24 Nov 2025 09:00:44 +0000
Labels:           app=hello-node-connect
                  pod-template-hash=7d85dfc575
Annotations:      <none>
Status:           Pending
IP:               10.244.0.7
IPs:
  IP:           10.244.0.7
Controlled By:  ReplicaSet/hello-node-connect-7d85dfc575
Containers:
  echo-server:
    Container ID:   
    Image:          kicbase/echo-server
    Image ID:       
    Port:           <none>
    Host Port:      <none>
    State:          Waiting
      Reason:       ImagePullBackOff
    Ready:          False
    Restart Count:  0
    Environment:    <none>
    Mounts:
      /var/run/secrets/kubernetes.io/serviceaccount from kube-api-access-8n6nc (ro)
Conditions:
  Type                        Status
  PodReadyToStartContainers   True 
  Initialized                 True 
  Ready                       False 
  ContainersReady             False 
  PodScheduled                True 
Volumes:
  kube-api-access-8n6nc:
    Type:                    Projected (a volume that contains injected data from multiple sources)
    TokenExpirationSeconds:  3607
    ConfigMapName:           kube-root-ca.crt
    Optional:                false
    DownwardAPI:             true
QoS Class:                   BestEffort
Node-Selectors:              <none>
Tolerations:                 node.kubernetes.io/not-ready:NoExecute op=Exists for 300s
                             node.kubernetes.io/unreachable:NoExecute op=Exists for 300s
Events:
  Type     Reason     Age   From               Message
  ----     ------     ----  ----               -------
  Normal   Scheduled  10m   default-scheduler  Successfully assigned default/hello-node-connect-7d85dfc575-vjgqt to functional-941011
  Warning  Failed     10m   kubelet            Failed to pull image "kicbase/echo-server": failed to pull and unpack image "docker.io/kicbase/echo-server:latest": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/kicbase/echo-server/manifests/sha256:42a89d9b22e5307cb88494990d5d929c401339f508c0a7e98a4d8ac52623fc5b: 429 Too Many Requests
toomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit
  Normal   Pulling  6m58s (x5 over 10m)    kubelet  Pulling image "kicbase/echo-server"
  Warning  Failed   6m58s (x5 over 10m)    kubelet  Error: ErrImagePull
  Warning  Failed   6m58s (x4 over 9m46s)  kubelet  Failed to pull image "kicbase/echo-server": failed to pull and unpack image "docker.io/kicbase/echo-server:latest": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/kicbase/echo-server/manifests/sha256:127ac38a2bb9537b7f252addff209ea6801edcac8a92c8b1104dacd66a583ed6: 429 Too Many Requests
toomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit
  Normal   BackOff  4m50s (x21 over 9m59s)  kubelet  Back-off pulling image "kicbase/echo-server"
  Warning  Failed   4m50s (x21 over 9m59s)  kubelet  Error: ImagePullBackOff
functional_test.go:1645: (dbg) Run:  kubectl --context functional-941011 logs hello-node-connect-7d85dfc575-vjgqt -n default
functional_test.go:1645: (dbg) Non-zero exit: kubectl --context functional-941011 logs hello-node-connect-7d85dfc575-vjgqt -n default: exit status 1 (141.311882ms)

** stderr ** 
	Error from server (BadRequest): container "echo-server" in pod "hello-node-connect-7d85dfc575-vjgqt" is waiting to start: trying and failing to pull image

** /stderr **
functional_test.go:1645: kubectl --context functional-941011 logs hello-node-connect-7d85dfc575-vjgqt -n default: exit status 1
functional_test.go:1646: failed waiting for hello-node pod: app=hello-node-connect within 10m0s: context deadline exceeded
functional_test.go:1608: service test failed - dumping debug information
functional_test.go:1609: -----------------------service failure post-mortem--------------------------------
functional_test.go:1612: (dbg) Run:  kubectl --context functional-941011 describe po hello-node-connect
functional_test.go:1616: hello-node pod describe:
Name:             hello-node-connect-7d85dfc575-vjgqt
Namespace:        default
Priority:         0
Service Account:  default
Node:             functional-941011/192.168.49.2
Start Time:       Mon, 24 Nov 2025 09:00:44 +0000
Labels:           app=hello-node-connect
                  pod-template-hash=7d85dfc575
Annotations:      <none>
Status:           Pending
IP:               10.244.0.7
IPs:
  IP:           10.244.0.7
Controlled By:  ReplicaSet/hello-node-connect-7d85dfc575
Containers:
  echo-server:
    Container ID:   
    Image:          kicbase/echo-server
    Image ID:       
    Port:           <none>
    Host Port:      <none>
    State:          Waiting
      Reason:       ImagePullBackOff
    Ready:          False
    Restart Count:  0
    Environment:    <none>
    Mounts:
      /var/run/secrets/kubernetes.io/serviceaccount from kube-api-access-8n6nc (ro)
Conditions:
  Type                        Status
  PodReadyToStartContainers   True 
  Initialized                 True 
  Ready                       False 
  ContainersReady             False 
  PodScheduled                True 
Volumes:
  kube-api-access-8n6nc:
    Type:                    Projected (a volume that contains injected data from multiple sources)
    TokenExpirationSeconds:  3607
    ConfigMapName:           kube-root-ca.crt
    Optional:                false
    DownwardAPI:             true
QoS Class:                   BestEffort
Node-Selectors:              <none>
Tolerations:                 node.kubernetes.io/not-ready:NoExecute op=Exists for 300s
                             node.kubernetes.io/unreachable:NoExecute op=Exists for 300s
Events:
  Type     Reason     Age   From               Message
  ----     ------     ----  ----               -------
  Normal   Scheduled  10m   default-scheduler  Successfully assigned default/hello-node-connect-7d85dfc575-vjgqt to functional-941011
  Warning  Failed     10m   kubelet            Failed to pull image "kicbase/echo-server": failed to pull and unpack image "docker.io/kicbase/echo-server:latest": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/kicbase/echo-server/manifests/sha256:42a89d9b22e5307cb88494990d5d929c401339f508c0a7e98a4d8ac52623fc5b: 429 Too Many Requests
toomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit
  Normal   Pulling  6m58s (x5 over 10m)    kubelet  Pulling image "kicbase/echo-server"
  Warning  Failed   6m58s (x5 over 10m)    kubelet  Error: ErrImagePull
  Warning  Failed   6m58s (x4 over 9m46s)  kubelet  Failed to pull image "kicbase/echo-server": failed to pull and unpack image "docker.io/kicbase/echo-server:latest": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/kicbase/echo-server/manifests/sha256:127ac38a2bb9537b7f252addff209ea6801edcac8a92c8b1104dacd66a583ed6: 429 Too Many Requests
toomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit
  Normal   BackOff  4m50s (x21 over 9m59s)  kubelet  Back-off pulling image "kicbase/echo-server"
  Warning  Failed   4m50s (x21 over 9m59s)  kubelet  Error: ImagePullBackOff

functional_test.go:1618: (dbg) Run:  kubectl --context functional-941011 logs -l app=hello-node-connect
functional_test.go:1618: (dbg) Non-zero exit: kubectl --context functional-941011 logs -l app=hello-node-connect: exit status 1 (86.781186ms)

** stderr ** 
	Error from server (BadRequest): container "echo-server" in pod "hello-node-connect-7d85dfc575-vjgqt" is waiting to start: trying and failing to pull image

** /stderr **
functional_test.go:1620: "kubectl --context functional-941011 logs -l app=hello-node-connect" failed: exit status 1
functional_test.go:1622: hello-node logs:
functional_test.go:1624: (dbg) Run:  kubectl --context functional-941011 describe svc hello-node-connect
functional_test.go:1628: hello-node svc describe:
Name:                     hello-node-connect
Namespace:                default
Labels:                   app=hello-node-connect
Annotations:              <none>
Selector:                 app=hello-node-connect
Type:                     NodePort
IP Family Policy:         SingleStack
IP Families:              IPv4
IP:                       10.99.193.46
IPs:                      10.99.193.46
Port:                     <unset>  8080/TCP
TargetPort:               8080/TCP
NodePort:                 <unset>  31671/TCP
Endpoints:                
Session Affinity:         None
External Traffic Policy:  Cluster
Internal Traffic Policy:  Cluster
Events:                   <none>
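The empty `Endpoints:` line in the service describe above is the direct cause of the connection failures: with the pod stuck in ImagePullBackOff, the NodePort has nothing to forward to. A sketch of checking for that condition from a describe dump (sample output inlined; against a live cluster one would pipe `kubectl describe svc <name>` in instead):

```shell
# Sketch: detect a NodePort service with no ready endpoints from its
# describe output. Sample lines copied from the hello-node-connect svc above.
svc='Type:                     NodePort
NodePort:                 <unset>  31671/TCP
Endpoints:                '

# Extract whatever follows "Endpoints:"; empty means no ready backing pods.
endpoints=$(printf '%s\n' "$svc" | sed -n 's/^Endpoints: *//p')
if [ -z "$endpoints" ]; then
  echo "service has no ready endpoints"
fi
```

An empty endpoints list with pods Pending is the signature of a scheduling or image-pull problem rather than a networking one.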
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:223: ======>  post-mortem[TestFunctional/parallel/ServiceCmdConnect]: network settings <======
helpers_test.go:230: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:238: ======>  post-mortem[TestFunctional/parallel/ServiceCmdConnect]: docker inspect <======
helpers_test.go:239: (dbg) Run:  docker inspect functional-941011
helpers_test.go:243: (dbg) docker inspect functional-941011:

-- stdout --
	[
	    {
	        "Id": "d8574c2bf48ce967fe6e255a178ebf105b1a3019c24cf41f2b79f5302c127773",
	        "Created": "2025-11-24T08:53:47.57593314Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 1679494,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-11-24T08:53:47.634288317Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:572c983e466f1f784136812eef5cc59ac623db764bc7704d3676c4643993fd08",
	        "ResolvConfPath": "/var/lib/docker/containers/d8574c2bf48ce967fe6e255a178ebf105b1a3019c24cf41f2b79f5302c127773/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/d8574c2bf48ce967fe6e255a178ebf105b1a3019c24cf41f2b79f5302c127773/hostname",
	        "HostsPath": "/var/lib/docker/containers/d8574c2bf48ce967fe6e255a178ebf105b1a3019c24cf41f2b79f5302c127773/hosts",
	        "LogPath": "/var/lib/docker/containers/d8574c2bf48ce967fe6e255a178ebf105b1a3019c24cf41f2b79f5302c127773/d8574c2bf48ce967fe6e255a178ebf105b1a3019c24cf41f2b79f5302c127773-json.log",
	        "Name": "/functional-941011",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "/lib/modules:/lib/modules:ro",
	                "functional-941011:/var"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-941011",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "d8574c2bf48ce967fe6e255a178ebf105b1a3019c24cf41f2b79f5302c127773",
	                "LowerDir": "/var/lib/docker/overlay2/6eeb35a95c7cafb92e460f903c5570446f5d8f4a8b4814f2aec79043c07ef0d3-init/diff:/var/lib/docker/overlay2/bf294786c7ade59cb761c7bdcc27045362c3779b00a569b685443f34ade17737/diff",
	                "MergedDir": "/var/lib/docker/overlay2/6eeb35a95c7cafb92e460f903c5570446f5d8f4a8b4814f2aec79043c07ef0d3/merged",
	                "UpperDir": "/var/lib/docker/overlay2/6eeb35a95c7cafb92e460f903c5570446f5d8f4a8b4814f2aec79043c07ef0d3/diff",
	                "WorkDir": "/var/lib/docker/overlay2/6eeb35a95c7cafb92e460f903c5570446f5d8f4a8b4814f2aec79043c07ef0d3/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            },
	            {
	                "Type": "volume",
	                "Name": "functional-941011",
	                "Source": "/var/lib/docker/volumes/functional-941011/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-941011",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-941011",
	                "name.minikube.sigs.k8s.io": "functional-941011",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "91cbecc0d651d94558cb202589b12e740389d40de185d06770e23f82cb68fc8d",
	            "SandboxKey": "/var/run/docker/netns/91cbecc0d651",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34679"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34680"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34683"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34681"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34682"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-941011": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "36:03:d7:7f:e5:c7",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "e7f7867899e274c20f652612139490d61ff49918c5fef46ebcab3194d02671b8",
	                    "EndpointID": "ca16f2cc76565150d8b128df549a1bd659112397b50cb5fa5c6631e2b78b03b5",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-941011",
	                        "d8574c2bf48c"
	                    ]
	                }
	            }
	        }
	    }
	]

-- /stdout --
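The `NetworkSettings.Ports` map in the inspect output above records which loopback host ports Docker mapped onto the container's fixed ports (22, 2376, 5000, 8441, 32443). As a small sketch, the published `HostPort` values can be pulled out of such a fragment with grep alone (sample JSON inlined; on a live host, `docker inspect --format` over the real object would be the cleaner route):

```shell
# Sketch: extract published host ports from a docker-inspect port-map
# fragment. Sample copied from the Ports map above.
inspect='"22/tcp": [{"HostIp": "127.0.0.1", "HostPort": "34679"}],
"8441/tcp": [{"HostIp": "127.0.0.1", "HostPort": "34682"}]'

# Match each quoted HostPort value, then strip everything but the digits.
ports=$(printf '%s\n' "$inspect" | grep -o '"HostPort": "[0-9][0-9]*"' | grep -o '[0-9][0-9]*')
echo "$ports"
```

Here port 34682 fronts 8441/tcp, the apiserver port minikube's `status` and `logs` commands in this post-mortem connect through.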
helpers_test.go:247: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-941011 -n functional-941011
helpers_test.go:252: <<< TestFunctional/parallel/ServiceCmdConnect FAILED: start of post-mortem logs <<<
helpers_test.go:253: ======>  post-mortem[TestFunctional/parallel/ServiceCmdConnect]: minikube logs <======
helpers_test.go:255: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 logs -n 25
helpers_test.go:255: (dbg) Done: out/minikube-linux-arm64 -p functional-941011 logs -n 25: (1.529597176s)
helpers_test.go:260: TestFunctional/parallel/ServiceCmdConnect logs: 
-- stdout --
	
	==> Audit <==
	┌────────────────┬───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│    COMMAND     │                                                               ARGS                                                                │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├────────────────┼───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ mount          │ -p functional-941011 /tmp/TestFunctionalparallelMountCmdspecific-port1627151296/001:/mount-9p --alsologtostderr -v=1 --port 46464 │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:02 UTC │                     │
	│ ssh            │ functional-941011 ssh findmnt -T /mount-9p | grep 9p                                                                              │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:02 UTC │ 24 Nov 25 09:02 UTC │
	│ ssh            │ functional-941011 ssh -- ls -la /mount-9p                                                                                         │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:02 UTC │ 24 Nov 25 09:02 UTC │
	│ ssh            │ functional-941011 ssh sudo umount -f /mount-9p                                                                                    │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:02 UTC │                     │
	│ mount          │ -p functional-941011 /tmp/TestFunctionalparallelMountCmdVerifyCleanup1964745822/001:/mount1 --alsologtostderr -v=1                │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:02 UTC │                     │
	│ ssh            │ functional-941011 ssh findmnt -T /mount1                                                                                          │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:02 UTC │ 24 Nov 25 09:02 UTC │
	│ mount          │ -p functional-941011 /tmp/TestFunctionalparallelMountCmdVerifyCleanup1964745822/001:/mount2 --alsologtostderr -v=1                │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:02 UTC │                     │
	│ mount          │ -p functional-941011 /tmp/TestFunctionalparallelMountCmdVerifyCleanup1964745822/001:/mount3 --alsologtostderr -v=1                │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:02 UTC │                     │
	│ ssh            │ functional-941011 ssh findmnt -T /mount2                                                                                          │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:02 UTC │ 24 Nov 25 09:02 UTC │
	│ ssh            │ functional-941011 ssh findmnt -T /mount3                                                                                          │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:02 UTC │ 24 Nov 25 09:02 UTC │
	│ mount          │ -p functional-941011 --kill=true                                                                                                  │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:02 UTC │                     │
	│ start          │ -p functional-941011 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=containerd                   │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:02 UTC │                     │
	│ start          │ -p functional-941011 --dry-run --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd                             │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:02 UTC │                     │
	│ start          │ -p functional-941011 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=containerd                   │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:02 UTC │                     │
	│ dashboard      │ --url --port 36195 -p functional-941011 --alsologtostderr -v=1                                                                    │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:02 UTC │                     │
	│ update-context │ functional-941011 update-context --alsologtostderr -v=2                                                                           │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:07 UTC │ 24 Nov 25 09:07 UTC │
	│ update-context │ functional-941011 update-context --alsologtostderr -v=2                                                                           │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:07 UTC │ 24 Nov 25 09:07 UTC │
	│ update-context │ functional-941011 update-context --alsologtostderr -v=2                                                                           │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:07 UTC │ 24 Nov 25 09:07 UTC │
	│ image          │ functional-941011 image ls --format short --alsologtostderr                                                                       │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:07 UTC │ 24 Nov 25 09:07 UTC │
	│ image          │ functional-941011 image ls --format yaml --alsologtostderr                                                                        │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:07 UTC │ 24 Nov 25 09:07 UTC │
	│ ssh            │ functional-941011 ssh pgrep buildkitd                                                                                             │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:07 UTC │                     │
	│ image          │ functional-941011 image build -t localhost/my-image:functional-941011 testdata/build --alsologtostderr                            │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:07 UTC │ 24 Nov 25 09:07 UTC │
	│ image          │ functional-941011 image ls                                                                                                        │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:07 UTC │ 24 Nov 25 09:07 UTC │
	│ image          │ functional-941011 image ls --format json --alsologtostderr                                                                        │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:07 UTC │ 24 Nov 25 09:07 UTC │
	│ image          │ functional-941011 image ls --format table --alsologtostderr                                                                       │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:07 UTC │ 24 Nov 25 09:07 UTC │
	└────────────────┴───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/11/24 09:02:10
	Running on machine: ip-172-31-30-239
	Binary: Built with gc go1.25.3 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1124 09:02:10.965378 1692444 out.go:360] Setting OutFile to fd 1 ...
	I1124 09:02:10.965542 1692444 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1124 09:02:10.965568 1692444 out.go:374] Setting ErrFile to fd 2...
	I1124 09:02:10.965585 1692444 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1124 09:02:10.965987 1692444 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21978-1652607/.minikube/bin
	I1124 09:02:10.966438 1692444 out.go:368] Setting JSON to false
	I1124 09:02:10.967543 1692444 start.go:133] hostinfo: {"hostname":"ip-172-31-30-239","uptime":27860,"bootTime":1763947071,"procs":190,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"92f46a7d-c249-4c12-924a-77f64874c910"}
	I1124 09:02:10.967616 1692444 start.go:143] virtualization:  
	I1124 09:02:10.970861 1692444 out.go:179] * [functional-941011] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1124 09:02:10.974642 1692444 out.go:179]   - MINIKUBE_LOCATION=21978
	I1124 09:02:10.974681 1692444 notify.go:221] Checking for updates...
	I1124 09:02:10.980472 1692444 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1124 09:02:10.983548 1692444 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/21978-1652607/kubeconfig
	I1124 09:02:10.986537 1692444 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/21978-1652607/.minikube
	I1124 09:02:10.989522 1692444 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1124 09:02:10.992492 1692444 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1124 09:02:10.996012 1692444 config.go:182] Loaded profile config "functional-941011": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
	I1124 09:02:10.999624 1692444 driver.go:422] Setting default libvirt URI to qemu:///system
	I1124 09:02:11.026708 1692444 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1124 09:02:11.026833 1692444 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1124 09:02:11.090005 1692444 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-11-24 09:02:11.080326924 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214831104 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1124 09:02:11.090124 1692444 docker.go:319] overlay module found
	I1124 09:02:11.093275 1692444 out.go:179] * Using the docker driver based on the existing profile
	I1124 09:02:11.095997 1692444 start.go:309] selected driver: docker
	I1124 09:02:11.096021 1692444 start.go:927] validating driver "docker" against &{Name:functional-941011 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.2 ClusterName:functional-941011 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.34.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1124 09:02:11.096132 1692444 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1124 09:02:11.099708 1692444 out.go:203] 
	W1124 09:02:11.102517 1692444 out.go:285] X Exiting due to RSRC_INSUFFICIENT_REQ_MEMORY: The requested memory allocation of 250MiB is below the usable minimum of 1800MB
	I1124 09:02:11.105331 1692444 out.go:203] 
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                      ATTEMPT             POD ID              POD                                         NAMESPACE
	423ebfbd58861       1611cd07b61d5       8 minutes ago       Exited              mount-munger              0                   acb67d9acb0da       busybox-mount                               default
	85c0699fd2b2d       ce2d2cda2d858       14 minutes ago      Running             echo-server               0                   8f38d234a4a08       hello-node-75c85bcc94-srggf                 default
	e068a2929b9a7       ba04bb24b9575       14 minutes ago      Running             storage-provisioner       2                   1d6d04a32577c       storage-provisioner                         kube-system
	12cb72b6be32a       b178af3d91f80       15 minutes ago      Running             kube-apiserver            0                   43e43c411fcfa       kube-apiserver-functional-941011            kube-system
	bb12959193539       1b34917560f09       15 minutes ago      Running             kube-controller-manager   1                   5f4eb46413316       kube-controller-manager-functional-941011   kube-system
	f631f089f7db7       2c5f0dedd21c2       15 minutes ago      Running             etcd                      1                   abc7abcd5ce10       etcd-functional-941011                      kube-system
	4db14bc0f3747       4f982e73e768a       15 minutes ago      Running             kube-scheduler            1                   0fac3868adf86       kube-scheduler-functional-941011            kube-system
	b6a84160fb5a4       ba04bb24b9575       15 minutes ago      Exited              storage-provisioner       1                   1d6d04a32577c       storage-provisioner                         kube-system
	fc6fc133ebd24       138784d87c9c5       15 minutes ago      Running             coredns                   1                   b1743c0f4c096       coredns-66bc5c9577-slkfz                    kube-system
	73cbc10088bae       94bff1bec29fd       15 minutes ago      Running             kube-proxy                1                   31e96b363c821       kube-proxy-kmdsq                            kube-system
	c6aa1ae3a9097       b1a8c6f707935       15 minutes ago      Running             kindnet-cni               1                   63a1adeff106f       kindnet-vsrrw                               kube-system
	c59aa8a0dc911       138784d87c9c5       15 minutes ago      Exited              coredns                   0                   b1743c0f4c096       coredns-66bc5c9577-slkfz                    kube-system
	ff28a2e89f65b       b1a8c6f707935       16 minutes ago      Exited              kindnet-cni               0                   63a1adeff106f       kindnet-vsrrw                               kube-system
	62155860688a4       94bff1bec29fd       16 minutes ago      Exited              kube-proxy                0                   31e96b363c821       kube-proxy-kmdsq                            kube-system
	39dbd124ad913       4f982e73e768a       16 minutes ago      Exited              kube-scheduler            0                   0fac3868adf86       kube-scheduler-functional-941011            kube-system
	8a2c1ed065d22       1b34917560f09       16 minutes ago      Exited              kube-controller-manager   0                   5f4eb46413316       kube-controller-manager-functional-941011   kube-system
	c67aa399f4b30       2c5f0dedd21c2       16 minutes ago      Exited              etcd                      0                   abc7abcd5ce10       etcd-functional-941011                      kube-system
	
	
	==> containerd <==
	Nov 24 09:07:03 functional-941011 containerd[3558]: time="2025-11-24T09:07:03.748953231Z" level=info msg="container event discarded" container=423ebfbd588615eb4bc213b5f471b5301692b734e90d6664d802a7debc9f3474 type=CONTAINER_STOPPED_EVENT
	Nov 24 09:07:05 functional-941011 containerd[3558]: time="2025-11-24T09:07:05.533170489Z" level=info msg="container event discarded" container=acb67d9acb0daf9420aa1204ef283901f23261427b26862cf7549b35a5da4d93 type=CONTAINER_STOPPED_EVENT
	Nov 24 09:07:12 functional-941011 containerd[3558]: time="2025-11-24T09:07:12.812655074Z" level=info msg="container event discarded" container=ceb13e87b0b993173e52065c7756a5d44c9ce60792725c69a0179256b703599d type=CONTAINER_CREATED_EVENT
	Nov 24 09:07:12 functional-941011 containerd[3558]: time="2025-11-24T09:07:12.812747120Z" level=info msg="container event discarded" container=ceb13e87b0b993173e52065c7756a5d44c9ce60792725c69a0179256b703599d type=CONTAINER_STARTED_EVENT
	Nov 24 09:07:12 functional-941011 containerd[3558]: time="2025-11-24T09:07:12.839271406Z" level=info msg="container event discarded" container=72303da31627e372cdd56a6534a3a836e6ac1a0ae5a307230504606fab495bc4 type=CONTAINER_CREATED_EVENT
	Nov 24 09:07:12 functional-941011 containerd[3558]: time="2025-11-24T09:07:12.839336293Z" level=info msg="container event discarded" container=72303da31627e372cdd56a6534a3a836e6ac1a0ae5a307230504606fab495bc4 type=CONTAINER_STARTED_EVENT
	Nov 24 09:07:18 functional-941011 containerd[3558]: time="2025-11-24T09:07:18.731238747Z" level=info msg="connecting to shim vfr5o31365ceazpiusjk62v70" address="unix:///run/containerd/s/6c25f01820df95851382cf9c364b63a368cd97485bb97a3f01c573ad0a81a811" namespace=k8s.io protocol=ttrpc version=3
	Nov 24 09:07:18 functional-941011 containerd[3558]: time="2025-11-24T09:07:18.824766537Z" level=info msg="shim disconnected" id=vfr5o31365ceazpiusjk62v70 namespace=k8s.io
	Nov 24 09:07:18 functional-941011 containerd[3558]: time="2025-11-24T09:07:18.824828183Z" level=info msg="cleaning up after shim disconnected" id=vfr5o31365ceazpiusjk62v70 namespace=k8s.io
	Nov 24 09:07:18 functional-941011 containerd[3558]: time="2025-11-24T09:07:18.824883511Z" level=info msg="cleaning up dead shim" id=vfr5o31365ceazpiusjk62v70 namespace=k8s.io
	Nov 24 09:07:19 functional-941011 containerd[3558]: time="2025-11-24T09:07:19.078784327Z" level=info msg="ImageCreate event name:\"localhost/my-image:functional-941011\""
	Nov 24 09:07:19 functional-941011 containerd[3558]: time="2025-11-24T09:07:19.088838090Z" level=info msg="ImageCreate event name:\"sha256:0dd6d757450a764d00308d5c72c625f64f21a54f2f9c75957a4660b66eb329a5\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Nov 24 09:07:19 functional-941011 containerd[3558]: time="2025-11-24T09:07:19.089218418Z" level=info msg="ImageUpdate event name:\"localhost/my-image:functional-941011\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Nov 24 09:07:20 functional-941011 containerd[3558]: time="2025-11-24T09:07:20.371626670Z" level=info msg="PullImage \"docker.io/nginx:alpine\""
	Nov 24 09:07:20 functional-941011 containerd[3558]: time="2025-11-24T09:07:20.814774327Z" level=info msg="stop pulling image docker.io/library/nginx:alpine: active requests=0, bytes read=10966"
	Nov 24 09:07:20 functional-941011 containerd[3558]: time="2025-11-24T09:07:20.815156822Z" level=error msg="PullImage \"docker.io/nginx:alpine\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"docker.io/library/nginx:alpine\": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/library/nginx/manifests/sha256:b3c656d55d7ad751196f21b7fd2e8d4da9cb430e32f646adcf92441b72f82b14: 429 Too Many Requests\ntoomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit"
	Nov 24 09:07:36 functional-941011 containerd[3558]: time="2025-11-24T09:07:36.373500727Z" level=info msg="PullImage \"docker.io/nginx:latest\""
	Nov 24 09:07:36 functional-941011 containerd[3558]: time="2025-11-24T09:07:36.806360136Z" level=error msg="PullImage \"docker.io/nginx:latest\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"docker.io/library/nginx:latest\": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/library/nginx/manifests/sha256:553f64aecdc31b5bf944521731cd70e35da4faed96b2b7548a3d8e2598c52a42: 429 Too Many Requests\ntoomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit"
	Nov 24 09:07:36 functional-941011 containerd[3558]: time="2025-11-24T09:07:36.806389937Z" level=info msg="stop pulling image docker.io/library/nginx:latest: active requests=0, bytes read=10967"
	Nov 24 09:07:45 functional-941011 containerd[3558]: time="2025-11-24T09:07:45.371075210Z" level=info msg="PullImage \"docker.io/kubernetesui/dashboard:v2.7.0@sha256:2e500d29e9d5f4a086b908eb8dfe7ecac57d2ab09d65b24f588b1d449841ef93\""
	Nov 24 09:07:45 functional-941011 containerd[3558]: time="2025-11-24T09:07:45.785214304Z" level=error msg="PullImage \"docker.io/kubernetesui/dashboard:v2.7.0@sha256:2e500d29e9d5f4a086b908eb8dfe7ecac57d2ab09d65b24f588b1d449841ef93\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"docker.io/kubernetesui/dashboard@sha256:2e500d29e9d5f4a086b908eb8dfe7ecac57d2ab09d65b24f588b1d449841ef93\": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/kubernetesui/dashboard/manifests/sha256:2e500d29e9d5f4a086b908eb8dfe7ecac57d2ab09d65b24f588b1d449841ef93: 429 Too Many Requests\ntoomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit"
	Nov 24 09:07:45 functional-941011 containerd[3558]: time="2025-11-24T09:07:45.785219990Z" level=info msg="stop pulling image docker.io/kubernetesui/dashboard@sha256:2e500d29e9d5f4a086b908eb8dfe7ecac57d2ab09d65b24f588b1d449841ef93: active requests=0, bytes read=11015"
	Nov 24 09:08:01 functional-941011 containerd[3558]: time="2025-11-24T09:08:01.371077491Z" level=info msg="PullImage \"docker.io/kubernetesui/metrics-scraper:v1.0.8@sha256:76049887f07a0476dc93efc2d3569b9529bf982b22d29f356092ce206e98765c\""
	Nov 24 09:08:01 functional-941011 containerd[3558]: time="2025-11-24T09:08:01.757789435Z" level=error msg="PullImage \"docker.io/kubernetesui/metrics-scraper:v1.0.8@sha256:76049887f07a0476dc93efc2d3569b9529bf982b22d29f356092ce206e98765c\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"docker.io/kubernetesui/metrics-scraper@sha256:76049887f07a0476dc93efc2d3569b9529bf982b22d29f356092ce206e98765c\": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/kubernetesui/metrics-scraper/manifests/sha256:76049887f07a0476dc93efc2d3569b9529bf982b22d29f356092ce206e98765c: 429 Too Many Requests\ntoomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit"
	Nov 24 09:08:01 functional-941011 containerd[3558]: time="2025-11-24T09:08:01.757792175Z" level=info msg="stop pulling image docker.io/kubernetesui/metrics-scraper@sha256:76049887f07a0476dc93efc2d3569b9529bf982b22d29f356092ce206e98765c: active requests=0, bytes read=11046"
	
	
	==> coredns [c59aa8a0dc911d866d1284907405697c4c3891d4b4c4423e9f764fbf37efa9c9] <==
	maxprocs: Leaving GOMAXPROCS=2: CPU quota undefined
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 9e2996f8cb67ac53e0259ab1f8d615d07d1beb0bd07e6a1e39769c3bf486a905bb991cc47f8d2f14d0d3a90a87dfc625a0b4c524fed169d8158c40657c0694b1
	CoreDNS-1.12.1
	linux/arm64, go1.24.1, 707c7c1
	[INFO] 127.0.0.1:51507 - 53970 "HINFO IN 1655404113277552318.700530042625674888. udp 56 false 512" NXDOMAIN qr,rd,ra 56 0.01547365s
	[INFO] SIGTERM: Shutting down servers then terminating
	[INFO] plugin/health: Going into lameduck mode for 5s
	
	
	==> coredns [fc6fc133ebd24da2f7fbde321f7cbdf8f57ce1d20c8e70b56a4530409fe91cf6] <==
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.32.3/tools/cache/reflector.go:251: failed to list *v1.Service: Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: connect: connection refused
	[ERROR] plugin/kubernetes: Unhandled Error
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.32.3/tools/cache/reflector.go:251: failed to list *v1.Namespace: Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: connect: connection refused
	[ERROR] plugin/kubernetes: Unhandled Error
	[WARNING] plugin/kubernetes: starting server with unsynced Kubernetes API
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 9e2996f8cb67ac53e0259ab1f8d615d07d1beb0bd07e6a1e39769c3bf486a905bb991cc47f8d2f14d0d3a90a87dfc625a0b4c524fed169d8158c40657c0694b1
	CoreDNS-1.12.1
	linux/arm64, go1.24.1, 707c7c1
	[INFO] 127.0.0.1:44127 - 22213 "HINFO IN 7327540771892768101.5397131767721917287. udp 57 false 512" NXDOMAIN qr,rd,ra 57 0.043158014s
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.32.3/tools/cache/reflector.go:251: failed to list *v1.Service: Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: connect: connection refused
	[ERROR] plugin/kubernetes: Unhandled Error
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.32.3/tools/cache/reflector.go:251: failed to list *v1.Namespace: Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: connect: connection refused
	[ERROR] plugin/kubernetes: Unhandled Error
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.32.3/tools/cache/reflector.go:251: failed to list *v1.EndpointSlice: Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: connect: connection refused
	[ERROR] plugin/kubernetes: Unhandled Error
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.32.3/tools/cache/reflector.go:251: failed to list *v1.Service: services is forbidden: User "system:serviceaccount:kube-system:coredns" cannot list resource "services" in API group "" at the cluster scope
	[ERROR] plugin/kubernetes: Unhandled Error
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.32.3/tools/cache/reflector.go:251: failed to list *v1.Namespace: namespaces is forbidden: User "system:serviceaccount:kube-system:coredns" cannot list resource "namespaces" in API group "" at the cluster scope
	[ERROR] plugin/kubernetes: Unhandled Error
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	
	
	==> describe nodes <==
	Name:               functional-941011
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=arm64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=arm64
	                    kubernetes.io/hostname=functional-941011
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=393ee3e0b845623107dce6cda4f48ffd5c3d1811
	                    minikube.k8s.io/name=functional-941011
	                    minikube.k8s.io/primary=true
	                    minikube.k8s.io/updated_at=2025_11_24T08_54_12_0700
	                    minikube.k8s.io/version=v1.37.0
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Mon, 24 Nov 2025 08:54:08 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  functional-941011
	  AcquireTime:     <unset>
	  RenewTime:       Mon, 24 Nov 2025 09:10:43 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Mon, 24 Nov 2025 09:07:51 +0000   Mon, 24 Nov 2025 08:54:06 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Mon, 24 Nov 2025 09:07:51 +0000   Mon, 24 Nov 2025 08:54:06 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Mon, 24 Nov 2025 09:07:51 +0000   Mon, 24 Nov 2025 08:54:06 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Mon, 24 Nov 2025 09:07:51 +0000   Mon, 24 Nov 2025 08:54:58 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.168.49.2
	  Hostname:    functional-941011
	Capacity:
	  cpu:                2
	  ephemeral-storage:  203034800Ki
	  hugepages-1Gi:      0
	  hugepages-2Mi:      0
	  hugepages-32Mi:     0
	  hugepages-64Ki:     0
	  memory:             8022296Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  203034800Ki
	  hugepages-1Gi:      0
	  hugepages-2Mi:      0
	  hugepages-32Mi:     0
	  hugepages-64Ki:     0
	  memory:             8022296Ki
	  pods:               110
	System Info:
	  Machine ID:                 7283ea1857f18f20a875c29069214c9d
	  System UUID:                d38b29f6-9a27-498f-a371-693f2677a0b6
	  Boot ID:                    e6ca431c-3a35-478f-87f6-f49cc4bc8a65
	  Kernel Version:             5.15.0-1084-aws
	  OS Image:                   Debian GNU/Linux 12 (bookworm)
	  Operating System:           linux
	  Architecture:               arm64
	  Container Runtime Version:  containerd://2.1.5
	  Kubelet Version:            v1.34.2
	  Kube-Proxy Version:         
	PodCIDR:                      10.244.0.0/24
	PodCIDRs:                     10.244.0.0/24
	Non-terminated Pods:          (14 in total)
	  Namespace                   Name                                          CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                          ------------  ----------  ---------------  -------------  ---
	  default                     hello-node-75c85bcc94-srggf                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         14m
	  default                     hello-node-connect-7d85dfc575-vjgqt           0 (0%)        0 (0%)      0 (0%)           0 (0%)         10m
	  default                     nginx-svc                                     0 (0%)        0 (0%)      0 (0%)           0 (0%)         14m
	  default                     sp-pod                                        0 (0%)        0 (0%)      0 (0%)           0 (0%)         14m
	  kube-system                 coredns-66bc5c9577-slkfz                      100m (5%)     0 (0%)      70Mi (0%)        170Mi (2%)     16m
	  kube-system                 etcd-functional-941011                        100m (5%)     0 (0%)      100Mi (1%)       0 (0%)         16m
	  kube-system                 kindnet-vsrrw                                 100m (5%)     100m (5%)   50Mi (0%)        50Mi (0%)      16m
	  kube-system                 kube-apiserver-functional-941011              250m (12%)    0 (0%)      0 (0%)           0 (0%)         14m
	  kube-system                 kube-controller-manager-functional-941011     200m (10%)    0 (0%)      0 (0%)           0 (0%)         16m
	  kube-system                 kube-proxy-kmdsq                              0 (0%)        0 (0%)      0 (0%)           0 (0%)         16m
	  kube-system                 kube-scheduler-functional-941011              100m (5%)     0 (0%)      0 (0%)           0 (0%)         16m
	  kube-system                 storage-provisioner                           0 (0%)        0 (0%)      0 (0%)           0 (0%)         16m
	  kubernetes-dashboard        dashboard-metrics-scraper-77bf4d6c4c-5qb25    0 (0%)        0 (0%)      0 (0%)           0 (0%)         8m34s
	  kubernetes-dashboard        kubernetes-dashboard-855c9754f9-tww5k         0 (0%)        0 (0%)      0 (0%)           0 (0%)         8m34s
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests    Limits
	  --------           --------    ------
	  cpu                850m (42%)  100m (5%)
	  memory             220Mi (2%)  220Mi (2%)
	  ephemeral-storage  0 (0%)      0 (0%)
	  hugepages-1Gi      0 (0%)      0 (0%)
	  hugepages-2Mi      0 (0%)      0 (0%)
	  hugepages-32Mi     0 (0%)      0 (0%)
	  hugepages-64Ki     0 (0%)      0 (0%)
	Events:
	  Type     Reason                   Age                From             Message
	  ----     ------                   ----               ----             -------
	  Normal   Starting                 16m                kube-proxy       
	  Normal   Starting                 15m                kube-proxy       
	  Normal   NodeAllocatableEnforced  16m                kubelet          Updated Node Allocatable limit across pods
	  Warning  CgroupV1                 16m                kubelet          cgroup v1 support is in maintenance mode, please migrate to cgroup v2
	  Normal   NodeHasSufficientMemory  16m                kubelet          Node functional-941011 status is now: NodeHasSufficientMemory
	  Normal   NodeHasNoDiskPressure    16m                kubelet          Node functional-941011 status is now: NodeHasNoDiskPressure
	  Normal   NodeHasSufficientPID     16m                kubelet          Node functional-941011 status is now: NodeHasSufficientPID
	  Normal   Starting                 16m                kubelet          Starting kubelet.
	  Normal   RegisteredNode           16m                node-controller  Node functional-941011 event: Registered Node functional-941011 in Controller
	  Normal   NodeReady                15m                kubelet          Node functional-941011 status is now: NodeReady
	  Normal   Starting                 15m                kubelet          Starting kubelet.
	  Warning  CgroupV1                 15m                kubelet          cgroup v1 support is in maintenance mode, please migrate to cgroup v2
	  Normal   NodeHasSufficientMemory  15m (x8 over 15m)  kubelet          Node functional-941011 status is now: NodeHasSufficientMemory
	  Normal   NodeHasNoDiskPressure    15m (x8 over 15m)  kubelet          Node functional-941011 status is now: NodeHasNoDiskPressure
	  Normal   NodeHasSufficientPID     15m (x7 over 15m)  kubelet          Node functional-941011 status is now: NodeHasSufficientPID
	  Normal   NodeAllocatableEnforced  15m                kubelet          Updated Node Allocatable limit across pods
	  Normal   RegisteredNode           14m                node-controller  Node functional-941011 event: Registered Node functional-941011 in Controller
	
	
	==> dmesg <==
	[Nov24 08:20] overlayfs: idmapped layers are currently not supported
	[Nov24 08:21] overlayfs: idmapped layers are currently not supported
	[Nov24 08:22] overlayfs: idmapped layers are currently not supported
	[Nov24 08:23] overlayfs: idmapped layers are currently not supported
	[Nov24 08:24] overlayfs: idmapped layers are currently not supported
	[ +19.566509] overlayfs: idmapped layers are currently not supported
	[Nov24 08:25] overlayfs: idmapped layers are currently not supported
	[ +35.934095] overlayfs: idmapped layers are currently not supported
	[Nov24 08:27] overlayfs: idmapped layers are currently not supported
	[Nov24 08:28] overlayfs: idmapped layers are currently not supported
	[Nov24 08:29] overlayfs: idmapped layers are currently not supported
	[Nov24 08:30] overlayfs: idmapped layers are currently not supported
	[Nov24 08:31] overlayfs: idmapped layers are currently not supported
	[ +44.505180] overlayfs: idmapped layers are currently not supported
	[Nov24 08:32] overlayfs: idmapped layers are currently not supported
	[Nov24 08:33] overlayfs: idmapped layers are currently not supported
	[Nov24 08:34] overlayfs: idmapped layers are currently not supported
	[ +21.137877] overlayfs: idmapped layers are currently not supported
	[ +28.724175] overlayfs: idmapped layers are currently not supported
	[Nov24 08:36] overlayfs: idmapped layers are currently not supported
	[Nov24 08:37] overlayfs: idmapped layers are currently not supported
	[Nov24 08:38] overlayfs: idmapped layers are currently not supported
	[  +4.063834] overlayfs: idmapped layers are currently not supported
	[Nov24 08:40] overlayfs: idmapped layers are currently not supported
	[Nov24 08:42] kauditd_printk_skb: 8 callbacks suppressed
	
	
	==> etcd [c67aa399f4b3099636ae5de0029e6f8778af29edd2cb80d538b3fce9f2752591] <==
	{"level":"warn","ts":"2025-11-24T08:54:08.005514Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:39950","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:54:08.022010Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:39954","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:54:08.048197Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:39980","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:54:08.061624Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:39986","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:54:08.083337Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:39998","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:54:08.099353Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:40006","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:54:08.164536Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:40022","server-name":"","error":"EOF"}
	{"level":"info","ts":"2025-11-24T08:55:38.679574Z","caller":"osutil/interrupt_unix.go:65","msg":"received signal; shutting down","signal":"terminated"}
	{"level":"info","ts":"2025-11-24T08:55:38.679640Z","caller":"embed/etcd.go:426","msg":"closing etcd server","name":"functional-941011","data-dir":"/var/lib/minikube/etcd","advertise-peer-urls":["https://192.168.49.2:2380"],"advertise-client-urls":["https://192.168.49.2:2379"]}
	{"level":"error","ts":"2025-11-24T08:55:38.679751Z","caller":"embed/etcd.go:912","msg":"setting up serving from embedded etcd failed.","error":"http: Server closed","stacktrace":"go.etcd.io/etcd/server/v3/embed.(*Etcd).errHandler\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:912\ngo.etcd.io/etcd/server/v3/embed.(*serveCtx).startHandler.func1\n\tgo.etcd.io/etcd/server/v3/embed/serve.go:90"}
	{"level":"error","ts":"2025-11-24T08:55:38.681678Z","caller":"embed/etcd.go:912","msg":"setting up serving from embedded etcd failed.","error":"http: Server closed","stacktrace":"go.etcd.io/etcd/server/v3/embed.(*Etcd).errHandler\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:912\ngo.etcd.io/etcd/server/v3/embed.(*serveCtx).startHandler.func1\n\tgo.etcd.io/etcd/server/v3/embed/serve.go:90"}
	{"level":"warn","ts":"2025-11-24T08:55:38.681817Z","caller":"embed/serve.go:245","msg":"stopping secure grpc server due to error","error":"accept tcp 127.0.0.1:2379: use of closed network connection"}
	{"level":"warn","ts":"2025-11-24T08:55:38.681861Z","caller":"embed/serve.go:247","msg":"stopped secure grpc server due to error","error":"accept tcp 127.0.0.1:2379: use of closed network connection"}
	{"level":"error","ts":"2025-11-24T08:55:38.681870Z","caller":"embed/etcd.go:912","msg":"setting up serving from embedded etcd failed.","error":"accept tcp 127.0.0.1:2379: use of closed network connection","stacktrace":"go.etcd.io/etcd/server/v3/embed.(*Etcd).errHandler\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:912\ngo.etcd.io/etcd/server/v3/embed.(*Etcd).startHandler.func1\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:906"}
	{"level":"warn","ts":"2025-11-24T08:55:38.681935Z","caller":"embed/serve.go:245","msg":"stopping secure grpc server due to error","error":"accept tcp 192.168.49.2:2379: use of closed network connection"}
	{"level":"warn","ts":"2025-11-24T08:55:38.681953Z","caller":"embed/serve.go:247","msg":"stopped secure grpc server due to error","error":"accept tcp 192.168.49.2:2379: use of closed network connection"}
	{"level":"error","ts":"2025-11-24T08:55:38.681992Z","caller":"embed/etcd.go:912","msg":"setting up serving from embedded etcd failed.","error":"accept tcp 127.0.0.1:2381: use of closed network connection","stacktrace":"go.etcd.io/etcd/server/v3/embed.(*Etcd).errHandler\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:912\ngo.etcd.io/etcd/server/v3/embed.(*Etcd).startHandler.func1\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:906"}
	{"level":"info","ts":"2025-11-24T08:55:38.682013Z","caller":"etcdserver/server.go:1297","msg":"skipped leadership transfer for single voting member cluster","local-member-id":"aec36adc501070cc","current-leader-member-id":"aec36adc501070cc"}
	{"level":"error","ts":"2025-11-24T08:55:38.681960Z","caller":"embed/etcd.go:912","msg":"setting up serving from embedded etcd failed.","error":"accept tcp 192.168.49.2:2379: use of closed network connection","stacktrace":"go.etcd.io/etcd/server/v3/embed.(*Etcd).errHandler\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:912\ngo.etcd.io/etcd/server/v3/embed.(*Etcd).startHandler.func1\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:906"}
	{"level":"info","ts":"2025-11-24T08:55:38.682068Z","caller":"etcdserver/server.go:2358","msg":"server has stopped; stopping storage version's monitor"}
	{"level":"info","ts":"2025-11-24T08:55:38.682078Z","caller":"etcdserver/server.go:2335","msg":"server has stopped; stopping cluster version's monitor"}
	{"level":"info","ts":"2025-11-24T08:55:38.685220Z","caller":"embed/etcd.go:621","msg":"stopping serving peer traffic","address":"192.168.49.2:2380"}
	{"level":"error","ts":"2025-11-24T08:55:38.685308Z","caller":"embed/etcd.go:912","msg":"setting up serving from embedded etcd failed.","error":"accept tcp 192.168.49.2:2380: use of closed network connection","stacktrace":"go.etcd.io/etcd/server/v3/embed.(*Etcd).errHandler\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:912\ngo.etcd.io/etcd/server/v3/embed.(*Etcd).startHandler.func1\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:906"}
	{"level":"info","ts":"2025-11-24T08:55:38.685331Z","caller":"embed/etcd.go:626","msg":"stopped serving peer traffic","address":"192.168.49.2:2380"}
	{"level":"info","ts":"2025-11-24T08:55:38.685338Z","caller":"embed/etcd.go:428","msg":"closed etcd server","name":"functional-941011","data-dir":"/var/lib/minikube/etcd","advertise-peer-urls":["https://192.168.49.2:2380"],"advertise-client-urls":["https://192.168.49.2:2379"]}
	
	
	==> etcd [f631f089f7db7469dd50cb6bbc0f39ec97f716fd16e1e26e288c4caf9028b204] <==
	{"level":"warn","ts":"2025-11-24T08:55:45.320828Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:40364","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:55:45.336722Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:40388","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:55:45.352974Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:40418","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:55:45.369743Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:40432","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:55:45.407571Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:40450","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:55:45.417153Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:40464","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:55:45.433463Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:40474","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:55:45.452186Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:40486","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:55:45.470636Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:40502","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:55:45.487399Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:40526","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:55:45.499700Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:40542","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:55:45.515500Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:40556","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:55:45.533300Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:40572","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:55:45.550559Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:40584","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:55:45.566082Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:40608","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:55:45.588289Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:40630","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:55:45.599854Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:40638","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:55:45.614758Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:40662","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:55:45.676226Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:40674","server-name":"","error":"EOF"}
	{"level":"info","ts":"2025-11-24T09:05:44.443465Z","caller":"mvcc/index.go:194","msg":"compact tree index","revision":983}
	{"level":"info","ts":"2025-11-24T09:05:44.451637Z","caller":"mvcc/kvstore_compaction.go:70","msg":"finished scheduled compaction","compact-revision":983,"took":"7.878151ms","hash":1980473058,"current-db-size-bytes":3526656,"current-db-size":"3.5 MB","current-db-size-in-use-bytes":3526656,"current-db-size-in-use":"3.5 MB"}
	{"level":"info","ts":"2025-11-24T09:05:44.451689Z","caller":"mvcc/hash.go:157","msg":"storing new hash","hash":1980473058,"revision":983,"compact-revision":-1}
	{"level":"info","ts":"2025-11-24T09:10:44.450526Z","caller":"mvcc/index.go:194","msg":"compact tree index","revision":1496}
	{"level":"info","ts":"2025-11-24T09:10:44.455039Z","caller":"mvcc/kvstore_compaction.go:70","msg":"finished scheduled compaction","compact-revision":1496,"took":"4.054739ms","hash":712497044,"current-db-size-bytes":3526656,"current-db-size":"3.5 MB","current-db-size-in-use-bytes":2441216,"current-db-size-in-use":"2.4 MB"}
	{"level":"info","ts":"2025-11-24T09:10:44.455096Z","caller":"mvcc/hash.go:157","msg":"storing new hash","hash":712497044,"revision":1496,"compact-revision":983}
	
	
	==> kernel <==
	 09:10:46 up  7:52,  0 user,  load average: 0.07, 0.26, 0.89
	Linux functional-941011 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kindnet [c6aa1ae3a9097decf5035a00445a3b904d34d773c5e3cbb839ca84c4b31b2ba7] <==
	I1124 09:08:39.328636       1 main.go:301] handling current node
	I1124 09:08:49.322369       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1124 09:08:49.322413       1 main.go:301] handling current node
	I1124 09:08:59.322340       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1124 09:08:59.322378       1 main.go:301] handling current node
	I1124 09:09:09.321986       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1124 09:09:09.322020       1 main.go:301] handling current node
	I1124 09:09:19.322004       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1124 09:09:19.322065       1 main.go:301] handling current node
	I1124 09:09:29.326825       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1124 09:09:29.326931       1 main.go:301] handling current node
	I1124 09:09:39.323276       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1124 09:09:39.323313       1 main.go:301] handling current node
	I1124 09:09:49.321844       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1124 09:09:49.321880       1 main.go:301] handling current node
	I1124 09:09:59.321862       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1124 09:09:59.321898       1 main.go:301] handling current node
	I1124 09:10:09.324041       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1124 09:10:09.324247       1 main.go:301] handling current node
	I1124 09:10:19.321435       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1124 09:10:19.321469       1 main.go:301] handling current node
	I1124 09:10:29.323325       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1124 09:10:29.323361       1 main.go:301] handling current node
	I1124 09:10:39.322937       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1124 09:10:39.322970       1 main.go:301] handling current node
	
	
	==> kindnet [ff28a2e89f65b3323fbc0f41262a0b445c865c0474fbdac4ec03f66b94a6826a] <==
	I1124 08:54:17.914898       1 main.go:139] hostIP = 192.168.49.2
	podIP = 192.168.49.2
	I1124 08:54:17.915083       1 main.go:148] setting mtu 1500 for CNI 
	I1124 08:54:17.915097       1 main.go:178] kindnetd IP family: "ipv4"
	I1124 08:54:17.915113       1 main.go:182] noMask IPv4 subnets: [10.244.0.0/16]
	time="2025-11-24T08:54:18Z" level=info msg="Created plugin 10-kube-network-policies (kindnetd, handles RunPodSandbox,RemovePodSandbox)"
	I1124 08:54:18.119698       1 controller.go:377] "Starting controller" name="kube-network-policies"
	I1124 08:54:18.119719       1 controller.go:381] "Waiting for informer caches to sync"
	I1124 08:54:18.119728       1 shared_informer.go:350] "Waiting for caches to sync" controller="kube-network-policies"
	I1124 08:54:18.119873       1 controller.go:390] nri plugin exited: failed to connect to NRI service: dial unix /var/run/nri/nri.sock: connect: no such file or directory
	E1124 08:54:48.120234       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.96.0.1:443/api/v1/nodes?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: i/o timeout" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Node"
	E1124 08:54:48.120236       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Namespace: Get \"https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: i/o timeout" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Namespace"
	E1124 08:54:48.120465       1 reflector.go:200] "Failed to watch" err="failed to list *v1.NetworkPolicy: Get \"https://10.96.0.1:443/apis/networking.k8s.io/v1/networkpolicies?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: i/o timeout" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.NetworkPolicy"
	E1124 08:54:48.120608       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Pod: Get \"https://10.96.0.1:443/api/v1/pods?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: i/o timeout" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Pod"
	I1124 08:54:49.719897       1 shared_informer.go:357] "Caches are synced" controller="kube-network-policies"
	I1124 08:54:49.719946       1 metrics.go:72] Registering metrics
	I1124 08:54:49.720169       1 controller.go:711] "Syncing nftables rules"
	I1124 08:54:58.119604       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1124 08:54:58.119873       1 main.go:301] handling current node
	I1124 08:55:08.122765       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1124 08:55:08.123311       1 main.go:301] handling current node
	I1124 08:55:18.122591       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1124 08:55:18.122626       1 main.go:301] handling current node
	I1124 08:55:28.122561       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1124 08:55:28.122829       1 main.go:301] handling current node
	
	
	==> kube-apiserver [12cb72b6be32a86f06dc22816a54fbdc70bf5efe54d3eba2e90672e835fad88f] <==
	I1124 08:55:46.624336       1 cache.go:39] Caches are synced for autoregister controller
	I1124 08:55:46.624559       1 shared_informer.go:356] "Caches are synced" controller="kubernetes-service-cidr-controller"
	I1124 08:55:46.624611       1 default_servicecidr_controller.go:137] Shutting down kubernetes-service-cidr-controller
	I1124 08:55:46.624734       1 shared_informer.go:356] "Caches are synced" controller="configmaps"
	I1124 08:55:46.637331       1 shared_informer.go:356] "Caches are synced" controller="ipallocator-repair-controller"
	I1124 08:55:46.644590       1 controller.go:667] quota admission added evaluator for: leases.coordination.k8s.io
	I1124 08:55:47.218727       1 storage_scheduling.go:111] all system priority classes are created successfully or already exist.
	I1124 08:55:47.442763       1 controller.go:667] quota admission added evaluator for: serviceaccounts
	W1124 08:55:47.629531       1 lease.go:265] Resetting endpoints for master service "kubernetes" to [192.168.49.2]
	I1124 08:55:47.631201       1 controller.go:667] quota admission added evaluator for: endpoints
	I1124 08:55:47.640973       1 controller.go:667] quota admission added evaluator for: endpointslices.discovery.k8s.io
	I1124 08:55:48.340758       1 controller.go:667] quota admission added evaluator for: daemonsets.apps
	I1124 08:55:48.559757       1 controller.go:667] quota admission added evaluator for: deployments.apps
	I1124 08:55:48.672814       1 controller.go:667] quota admission added evaluator for: roles.rbac.authorization.k8s.io
	I1124 08:55:48.686003       1 controller.go:667] quota admission added evaluator for: rolebindings.rbac.authorization.k8s.io
	I1124 08:55:57.253802       1 controller.go:667] quota admission added evaluator for: replicasets.apps
	I1124 08:56:15.389377       1 alloc.go:328] "allocated clusterIPs" service="default/invalid-svc" clusterIPs={"IPv4":"10.100.46.187"}
	E1124 08:56:19.614283       1 watch.go:272] "Unhandled Error" err="http2: stream closed" logger="UnhandledError"
	I1124 08:56:24.710981       1 alloc.go:328] "allocated clusterIPs" service="default/hello-node" clusterIPs={"IPv4":"10.110.108.17"}
	I1124 08:56:29.396574       1 alloc.go:328] "allocated clusterIPs" service="default/nginx-svc" clusterIPs={"IPv4":"10.99.196.76"}
	I1124 09:00:44.624602       1 alloc.go:328] "allocated clusterIPs" service="default/hello-node-connect" clusterIPs={"IPv4":"10.99.193.46"}
	I1124 09:02:12.158594       1 controller.go:667] quota admission added evaluator for: namespaces
	I1124 09:02:12.437241       1 alloc.go:328] "allocated clusterIPs" service="kubernetes-dashboard/kubernetes-dashboard" clusterIPs={"IPv4":"10.96.163.151"}
	I1124 09:02:12.462775       1 alloc.go:328] "allocated clusterIPs" service="kubernetes-dashboard/dashboard-metrics-scraper" clusterIPs={"IPv4":"10.102.193.246"}
	I1124 09:05:46.560487       1 cidrallocator.go:277] updated ClusterIP allocator for Service CIDR 10.96.0.0/12
	
	
	==> kube-controller-manager [8a2c1ed065d22c98f59a6c1c4a6816d5635a1993d4b79170db5bdf930e6a5464] <==
	I1124 08:54:15.869534       1 shared_informer.go:356] "Caches are synced" controller="PV protection"
	I1124 08:54:15.869543       1 shared_informer.go:356] "Caches are synced" controller="ReplicaSet"
	I1124 08:54:15.869777       1 shared_informer.go:356] "Caches are synced" controller="namespace"
	I1124 08:54:15.878398       1 range_allocator.go:428] "Set node PodCIDR" logger="node-ipam-controller" node="functional-941011" podCIDRs=["10.244.0.0/24"]
	I1124 08:54:15.882870       1 shared_informer.go:356] "Caches are synced" controller="HPA"
	I1124 08:54:15.886073       1 shared_informer.go:356] "Caches are synced" controller="garbage collector"
	I1124 08:54:15.890370       1 shared_informer.go:356] "Caches are synced" controller="PVC protection"
	I1124 08:54:15.903201       1 shared_informer.go:356] "Caches are synced" controller="garbage collector"
	I1124 08:54:15.903228       1 garbagecollector.go:154] "Garbage collector: all resource monitors have synced" logger="garbage-collector-controller"
	I1124 08:54:15.903236       1 garbagecollector.go:157] "Proceeding to collect garbage" logger="garbage-collector-controller"
	I1124 08:54:15.908644       1 shared_informer.go:356] "Caches are synced" controller="disruption"
	I1124 08:54:15.909119       1 shared_informer.go:356] "Caches are synced" controller="validatingadmissionpolicy-status"
	I1124 08:54:15.910226       1 shared_informer.go:356] "Caches are synced" controller="endpoint_slice"
	I1124 08:54:15.910325       1 shared_informer.go:356] "Caches are synced" controller="ClusterRoleAggregator"
	I1124 08:54:15.910394       1 shared_informer.go:356] "Caches are synced" controller="attach detach"
	I1124 08:54:15.910704       1 shared_informer.go:356] "Caches are synced" controller="job"
	I1124 08:54:15.910793       1 shared_informer.go:356] "Caches are synced" controller="GC"
	I1124 08:54:15.910845       1 shared_informer.go:356] "Caches are synced" controller="legacy-service-account-token-cleaner"
	I1124 08:54:15.910971       1 shared_informer.go:356] "Caches are synced" controller="VAC protection"
	I1124 08:54:15.911017       1 shared_informer.go:356] "Caches are synced" controller="taint-eviction-controller"
	I1124 08:54:15.913830       1 shared_informer.go:356] "Caches are synced" controller="endpoint_slice_mirroring"
	I1124 08:54:15.916079       1 shared_informer.go:356] "Caches are synced" controller="persistent volume"
	I1124 08:54:15.918449       1 shared_informer.go:356] "Caches are synced" controller="resource quota"
	I1124 08:54:15.919891       1 shared_informer.go:356] "Caches are synced" controller="ephemeral"
	I1124 08:55:00.864980       1 node_lifecycle_controller.go:1044] "Controller detected that some Nodes are Ready. Exiting master disruption mode" logger="node-lifecycle-controller"
	
	
	==> kube-controller-manager [bb129591935391164c1c4d497b52259d48968c471089ff98292655891beffe48] <==
	I1124 08:55:50.031540       1 shared_informer.go:356] "Caches are synced" controller="resource quota"
	I1124 08:55:50.034699       1 shared_informer.go:356] "Caches are synced" controller="certificate-csrsigning-kubelet-client"
	I1124 08:55:50.034721       1 shared_informer.go:356] "Caches are synced" controller="certificate-csrsigning-kubelet-serving"
	I1124 08:55:50.035887       1 shared_informer.go:356] "Caches are synced" controller="disruption"
	I1124 08:55:50.039119       1 shared_informer.go:356] "Caches are synced" controller="certificate-csrsigning-kube-apiserver-client"
	I1124 08:55:50.040438       1 shared_informer.go:356] "Caches are synced" controller="certificate-csrsigning-legacy-unknown"
	I1124 08:55:50.043772       1 shared_informer.go:356] "Caches are synced" controller="TTL"
	I1124 08:55:50.049118       1 shared_informer.go:356] "Caches are synced" controller="VAC protection"
	I1124 08:55:50.051401       1 shared_informer.go:356] "Caches are synced" controller="garbage collector"
	I1124 08:55:50.054626       1 shared_informer.go:356] "Caches are synced" controller="ephemeral"
	I1124 08:55:50.060982       1 shared_informer.go:356] "Caches are synced" controller="endpoint_slice_mirroring"
	I1124 08:55:50.068706       1 shared_informer.go:356] "Caches are synced" controller="TTL after finished"
	I1124 08:55:50.075379       1 shared_informer.go:356] "Caches are synced" controller="resource_claim"
	I1124 08:55:50.077489       1 shared_informer.go:356] "Caches are synced" controller="endpoint"
	I1124 08:55:50.093310       1 shared_informer.go:356] "Caches are synced" controller="garbage collector"
	I1124 08:55:50.093553       1 garbagecollector.go:154] "Garbage collector: all resource monitors have synced" logger="garbage-collector-controller"
	I1124 08:55:50.093642       1 garbagecollector.go:157] "Proceeding to collect garbage" logger="garbage-collector-controller"
	E1124 09:02:12.261542       1 replica_set.go:587] "Unhandled Error" err="sync \"kubernetes-dashboard/dashboard-metrics-scraper-77bf4d6c4c\" failed with pods \"dashboard-metrics-scraper-77bf4d6c4c-\" is forbidden: error looking up service account kubernetes-dashboard/kubernetes-dashboard: serviceaccount \"kubernetes-dashboard\" not found" logger="UnhandledError"
	E1124 09:02:12.275551       1 replica_set.go:587] "Unhandled Error" err="sync \"kubernetes-dashboard/dashboard-metrics-scraper-77bf4d6c4c\" failed with pods \"dashboard-metrics-scraper-77bf4d6c4c-\" is forbidden: error looking up service account kubernetes-dashboard/kubernetes-dashboard: serviceaccount \"kubernetes-dashboard\" not found" logger="UnhandledError"
	E1124 09:02:12.284944       1 replica_set.go:587] "Unhandled Error" err="sync \"kubernetes-dashboard/kubernetes-dashboard-855c9754f9\" failed with pods \"kubernetes-dashboard-855c9754f9-\" is forbidden: error looking up service account kubernetes-dashboard/kubernetes-dashboard: serviceaccount \"kubernetes-dashboard\" not found" logger="UnhandledError"
	E1124 09:02:12.286796       1 replica_set.go:587] "Unhandled Error" err="sync \"kubernetes-dashboard/dashboard-metrics-scraper-77bf4d6c4c\" failed with pods \"dashboard-metrics-scraper-77bf4d6c4c-\" is forbidden: error looking up service account kubernetes-dashboard/kubernetes-dashboard: serviceaccount \"kubernetes-dashboard\" not found" logger="UnhandledError"
	E1124 09:02:12.293749       1 replica_set.go:587] "Unhandled Error" err="sync \"kubernetes-dashboard/kubernetes-dashboard-855c9754f9\" failed with pods \"kubernetes-dashboard-855c9754f9-\" is forbidden: error looking up service account kubernetes-dashboard/kubernetes-dashboard: serviceaccount \"kubernetes-dashboard\" not found" logger="UnhandledError"
	E1124 09:02:12.297333       1 replica_set.go:587] "Unhandled Error" err="sync \"kubernetes-dashboard/dashboard-metrics-scraper-77bf4d6c4c\" failed with pods \"dashboard-metrics-scraper-77bf4d6c4c-\" is forbidden: error looking up service account kubernetes-dashboard/kubernetes-dashboard: serviceaccount \"kubernetes-dashboard\" not found" logger="UnhandledError"
	E1124 09:02:12.304126       1 replica_set.go:587] "Unhandled Error" err="sync \"kubernetes-dashboard/kubernetes-dashboard-855c9754f9\" failed with pods \"kubernetes-dashboard-855c9754f9-\" is forbidden: error looking up service account kubernetes-dashboard/kubernetes-dashboard: serviceaccount \"kubernetes-dashboard\" not found" logger="UnhandledError"
	E1124 09:02:12.311952       1 replica_set.go:587] "Unhandled Error" err="sync \"kubernetes-dashboard/kubernetes-dashboard-855c9754f9\" failed with pods \"kubernetes-dashboard-855c9754f9-\" is forbidden: error looking up service account kubernetes-dashboard/kubernetes-dashboard: serviceaccount \"kubernetes-dashboard\" not found" logger="UnhandledError"
	
	
	==> kube-proxy [62155860688a446bc4f0edf281779e28f4ecf113640dbf9995722ce6bfe4cd55] <==
	I1124 08:54:17.565304       1 server_linux.go:53] "Using iptables proxy"
	I1124 08:54:17.660540       1 shared_informer.go:349] "Waiting for caches to sync" controller="node informer cache"
	I1124 08:54:17.760879       1 shared_informer.go:356] "Caches are synced" controller="node informer cache"
	I1124 08:54:17.760920       1 server.go:219] "Successfully retrieved NodeIPs" NodeIPs=["192.168.49.2"]
	E1124 08:54:17.761072       1 server.go:256] "Kube-proxy configuration may be incomplete or incorrect" err="nodePortAddresses is unset; NodePort connections will be accepted on all local IPs. Consider using `--nodeport-addresses primary`"
	I1124 08:54:17.852239       1 server.go:265] "kube-proxy running in dual-stack mode" primary ipFamily="IPv4"
	I1124 08:54:17.852307       1 server_linux.go:132] "Using iptables Proxier"
	I1124 08:54:17.866599       1 proxier.go:242] "Setting route_localnet=1 to allow node-ports on localhost; to change this either disable iptables.localhostNodePorts (--iptables-localhost-nodeports) or set nodePortAddresses (--nodeport-addresses) to filter loopback addresses" ipFamily="IPv4"
	I1124 08:54:17.866936       1 server.go:527] "Version info" version="v1.34.2"
	I1124 08:54:17.866952       1 server.go:529] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I1124 08:54:17.868618       1 config.go:200] "Starting service config controller"
	I1124 08:54:17.868629       1 shared_informer.go:349] "Waiting for caches to sync" controller="service config"
	I1124 08:54:17.868646       1 config.go:106] "Starting endpoint slice config controller"
	I1124 08:54:17.868649       1 shared_informer.go:349] "Waiting for caches to sync" controller="endpoint slice config"
	I1124 08:54:17.868660       1 config.go:403] "Starting serviceCIDR config controller"
	I1124 08:54:17.868666       1 shared_informer.go:349] "Waiting for caches to sync" controller="serviceCIDR config"
	I1124 08:54:17.876964       1 config.go:309] "Starting node config controller"
	I1124 08:54:17.876996       1 shared_informer.go:349] "Waiting for caches to sync" controller="node config"
	I1124 08:54:17.877013       1 shared_informer.go:356] "Caches are synced" controller="node config"
	I1124 08:54:17.969733       1 shared_informer.go:356] "Caches are synced" controller="service config"
	I1124 08:54:17.969806       1 shared_informer.go:356] "Caches are synced" controller="endpoint slice config"
	I1124 08:54:17.972675       1 shared_informer.go:356] "Caches are synced" controller="serviceCIDR config"
	
	
	==> kube-proxy [73cbc10088baee047f5af65dd9d3da14101d5c79193d1084620f4ef78567007c] <==
	I1124 08:55:29.304214       1 shared_informer.go:349] "Waiting for caches to sync" controller="node informer cache"
	E1124 08:55:29.305266       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://control-plane.minikube.internal:8441/api/v1/nodes?fieldSelector=metadata.name%3Dfunctional-941011&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	E1124 08:55:30.871934       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://control-plane.minikube.internal:8441/api/v1/nodes?fieldSelector=metadata.name%3Dfunctional-941011&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	E1124 08:55:32.698486       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://control-plane.minikube.internal:8441/api/v1/nodes?fieldSelector=metadata.name%3Dfunctional-941011&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	E1124 08:55:38.932652       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://control-plane.minikube.internal:8441/api/v1/nodes?fieldSelector=metadata.name%3Dfunctional-941011&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	I1124 08:55:46.604339       1 shared_informer.go:356] "Caches are synced" controller="node informer cache"
	I1124 08:55:46.604382       1 server.go:219] "Successfully retrieved NodeIPs" NodeIPs=["192.168.49.2"]
	E1124 08:55:46.604726       1 server.go:256] "Kube-proxy configuration may be incomplete or incorrect" err="nodePortAddresses is unset; NodePort connections will be accepted on all local IPs. Consider using `--nodeport-addresses primary`"
	I1124 08:55:46.633087       1 server.go:265] "kube-proxy running in dual-stack mode" primary ipFamily="IPv4"
	I1124 08:55:46.633170       1 server_linux.go:132] "Using iptables Proxier"
	I1124 08:55:46.637676       1 proxier.go:242] "Setting route_localnet=1 to allow node-ports on localhost; to change this either disable iptables.localhostNodePorts (--iptables-localhost-nodeports) or set nodePortAddresses (--nodeport-addresses) to filter loopback addresses" ipFamily="IPv4"
	I1124 08:55:46.638100       1 server.go:527] "Version info" version="v1.34.2"
	I1124 08:55:46.638127       1 server.go:529] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I1124 08:55:46.640035       1 config.go:200] "Starting service config controller"
	I1124 08:55:46.640207       1 shared_informer.go:349] "Waiting for caches to sync" controller="service config"
	I1124 08:55:46.640350       1 config.go:106] "Starting endpoint slice config controller"
	I1124 08:55:46.640418       1 shared_informer.go:349] "Waiting for caches to sync" controller="endpoint slice config"
	I1124 08:55:46.640509       1 config.go:403] "Starting serviceCIDR config controller"
	I1124 08:55:46.640794       1 shared_informer.go:349] "Waiting for caches to sync" controller="serviceCIDR config"
	I1124 08:55:46.647186       1 config.go:309] "Starting node config controller"
	I1124 08:55:46.647270       1 shared_informer.go:349] "Waiting for caches to sync" controller="node config"
	I1124 08:55:46.647299       1 shared_informer.go:356] "Caches are synced" controller="node config"
	I1124 08:55:46.741106       1 shared_informer.go:356] "Caches are synced" controller="serviceCIDR config"
	I1124 08:55:46.741112       1 shared_informer.go:356] "Caches are synced" controller="service config"
	I1124 08:55:46.741150       1 shared_informer.go:356] "Caches are synced" controller="endpoint slice config"
	
	
	==> kube-scheduler [39dbd124ad9131afd2e38d1cc6019bc4e02106f43653bf39b293c3e257811124] <==
	E1124 08:54:08.898071       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ResourceClaim: resourceclaims.resource.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"resourceclaims\" in API group \"resource.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ResourceClaim"
	E1124 08:54:08.898417       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:kube-scheduler\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
	E1124 08:54:08.898585       1 reflector.go:205] "Failed to watch" err="failed to list *v1.PersistentVolume: persistentvolumes is forbidden: User \"system:kube-scheduler\" cannot list resource \"persistentvolumes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.PersistentVolume"
	E1124 08:54:08.898858       1 reflector.go:205] "Failed to watch" err="failed to list *v1.StatefulSet: statefulsets.apps is forbidden: User \"system:kube-scheduler\" cannot list resource \"statefulsets\" in API group \"apps\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.StatefulSet"
	E1124 08:54:08.898950       1 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
	E1124 08:54:08.899058       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ReplicationController: replicationcontrollers is forbidden: User \"system:kube-scheduler\" cannot list resource \"replicationcontrollers\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ReplicationController"
	E1124 08:54:08.899111       1 reflector.go:205] "Failed to watch" err="failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"storageclasses\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.StorageClass"
	E1124 08:54:09.739885       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Namespace: namespaces is forbidden: User \"system:kube-scheduler\" cannot list resource \"namespaces\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Namespace"
	E1124 08:54:09.749915       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User \"system:kube-scheduler\" cannot list resource \"replicasets\" in API group \"apps\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ReplicaSet"
	E1124 08:54:09.907520       1 reflector.go:205] "Failed to watch" err="failed to list *v1.PersistentVolume: persistentvolumes is forbidden: User \"system:kube-scheduler\" cannot list resource \"persistentvolumes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.PersistentVolume"
	E1124 08:54:09.970254       1 reflector.go:205] "Failed to watch" err="failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User \"system:kube-scheduler\" cannot list resource \"persistentvolumeclaims\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.PersistentVolumeClaim"
	E1124 08:54:10.012374       1 reflector.go:205] "Failed to watch" err="failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"storageclasses\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.StorageClass"
	E1124 08:54:10.028349       1 reflector.go:205] "Failed to watch" err="failed to list *v1.StatefulSet: statefulsets.apps is forbidden: User \"system:kube-scheduler\" cannot list resource \"statefulsets\" in API group \"apps\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.StatefulSet"
	E1124 08:54:10.089620       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Pod: pods is forbidden: User \"system:kube-scheduler\" cannot list resource \"pods\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Pod"
	E1124 08:54:10.145424       1 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
	E1124 08:54:10.145452       1 reflector.go:205] "Failed to watch" err="failed to list *v1.DeviceClass: deviceclasses.resource.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"deviceclasses\" in API group \"resource.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.DeviceClass"
	E1124 08:54:10.166448       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ReplicationController: replicationcontrollers is forbidden: User \"system:kube-scheduler\" cannot list resource \"replicationcontrollers\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ReplicationController"
	E1124 08:54:10.168975       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:kube-scheduler\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
	E1124 08:54:10.472917       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"extension-apiserver-authentication\" is forbidden: User \"system:kube-scheduler\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"kube-system\"" logger="UnhandledError" reflector="runtime/asm_arm64.s:1223" type="*v1.ConfigMap"
	I1124 08:54:12.282734       1 shared_informer.go:356] "Caches are synced" controller="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
	I1124 08:55:28.427975       1 secure_serving.go:259] Stopped listening on 127.0.0.1:10259
	I1124 08:55:28.428004       1 server.go:263] "[graceful-termination] secure server has stopped listening"
	I1124 08:55:28.428018       1 configmap_cafile_content.go:226] "Shutting down controller" name="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
	I1124 08:55:28.428088       1 server.go:265] "[graceful-termination] secure server is exiting"
	E1124 08:55:28.428102       1 run.go:72] "command failed" err="finished without leader elect"
	
	
	==> kube-scheduler [4db14bc0f3747bb08ccdf3a4232220b42a95e683884234f9fff57a09e95b88c3] <==
	E1124 08:55:36.598347       1 reflector.go:205] "Failed to watch" err="failed to list *v1.VolumeAttachment: Get \"https://192.168.49.2:8441/apis/storage.k8s.io/v1/volumeattachments?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.VolumeAttachment"
	E1124 08:55:36.838482       1 reflector.go:205] "Failed to watch" err="failed to list *v1.PodDisruptionBudget: Get \"https://192.168.49.2:8441/apis/policy/v1/poddisruptionbudgets?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.PodDisruptionBudget"
	E1124 08:55:37.242275       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Namespace: Get \"https://192.168.49.2:8441/api/v1/namespaces?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Namespace"
	E1124 08:55:37.413389       1 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://192.168.49.2:8441/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
	E1124 08:55:37.515451       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: Get \"https://192.168.49.2:8441/api/v1/services?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
	E1124 08:55:37.711691       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Pod: Get \"https://192.168.49.2:8441/api/v1/pods?fieldSelector=status.phase%21%3DSucceeded%2Cstatus.phase%21%3DFailed&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Pod"
	E1124 08:55:38.082012       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ConfigMap: Get \"https://192.168.49.2:8441/api/v1/namespaces/kube-system/configmaps?fieldSelector=metadata.name%3Dextension-apiserver-authentication&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="runtime/asm_arm64.s:1223" type="*v1.ConfigMap"
	E1124 08:55:38.154030       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ResourceClaim: Get \"https://192.168.49.2:8441/apis/resource.k8s.io/v1/resourceclaims?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ResourceClaim"
	E1124 08:55:38.280385       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ResourceSlice: Get \"https://192.168.49.2:8441/apis/resource.k8s.io/v1/resourceslices?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ResourceSlice"
	E1124 08:55:38.363163       1 reflector.go:205] "Failed to watch" err="failed to list *v1.CSINode: Get \"https://192.168.49.2:8441/apis/storage.k8s.io/v1/csinodes?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSINode"
	E1124 08:55:38.536440       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://192.168.49.2:8441/api/v1/nodes?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	E1124 08:55:38.785077       1 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIStorageCapacity: Get \"https://192.168.49.2:8441/apis/storage.k8s.io/v1/csistoragecapacities?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIStorageCapacity"
	E1124 08:55:38.930056       1 reflector.go:205] "Failed to watch" err="failed to list *v1.DeviceClass: Get \"https://192.168.49.2:8441/apis/resource.k8s.io/v1/deviceclasses?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.DeviceClass"
	E1124 08:55:38.985957       1 reflector.go:205] "Failed to watch" err="failed to list *v1.PersistentVolumeClaim: Get \"https://192.168.49.2:8441/api/v1/persistentvolumeclaims?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.PersistentVolumeClaim"
	E1124 08:55:39.039078       1 reflector.go:205] "Failed to watch" err="failed to list *v1.StatefulSet: Get \"https://192.168.49.2:8441/apis/apps/v1/statefulsets?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.StatefulSet"
	E1124 08:55:39.348953       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ReplicaSet: Get \"https://192.168.49.2:8441/apis/apps/v1/replicasets?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ReplicaSet"
	E1124 08:55:39.714746       1 reflector.go:205] "Failed to watch" err="failed to list *v1.StorageClass: Get \"https://192.168.49.2:8441/apis/storage.k8s.io/v1/storageclasses?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.StorageClass"
	E1124 08:55:39.840410       1 reflector.go:205] "Failed to watch" err="failed to list *v1.PersistentVolume: Get \"https://192.168.49.2:8441/api/v1/persistentvolumes?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.PersistentVolume"
	E1124 08:55:46.556944       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ResourceClaim: resourceclaims.resource.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"resourceclaims\" in API group \"resource.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ResourceClaim"
	E1124 08:55:46.557177       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ReplicationController: replicationcontrollers is forbidden: User \"system:kube-scheduler\" cannot list resource \"replicationcontrollers\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ReplicationController"
	E1124 08:55:46.557352       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Pod: pods is forbidden: User \"system:kube-scheduler\" cannot list resource \"pods\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Pod"
	E1124 08:55:46.557551       1 reflector.go:205] "Failed to watch" err="failed to list *v1.PodDisruptionBudget: poddisruptionbudgets.policy is forbidden: User \"system:kube-scheduler\" cannot list resource \"poddisruptionbudgets\" in API group \"policy\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.PodDisruptionBudget"
	E1124 08:55:46.557775       1 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
	E1124 08:55:46.558034       1 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIStorageCapacity: csistoragecapacities.storage.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"csistoragecapacities\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIStorageCapacity"
	I1124 08:55:47.832870       1 shared_informer.go:356] "Caches are synced" controller="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
	
	
	==> kubelet <==
	Nov 24 09:09:43 functional-941011 kubelet[4469]: E1124 09:09:43.370491    4469 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubernetes-dashboard\" with ImagePullBackOff: \"Back-off pulling image \\\"docker.io/kubernetesui/dashboard:v2.7.0@sha256:2e500d29e9d5f4a086b908eb8dfe7ecac57d2ab09d65b24f588b1d449841ef93\\\": ErrImagePull: failed to pull and unpack image \\\"docker.io/kubernetesui/dashboard@sha256:2e500d29e9d5f4a086b908eb8dfe7ecac57d2ab09d65b24f588b1d449841ef93\\\": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/kubernetesui/dashboard/manifests/sha256:2e500d29e9d5f4a086b908eb8dfe7ecac57d2ab09d65b24f588b1d449841ef93: 429 Too Many Requests\\ntoomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit\"" pod="kubernetes-dashboard/kubernetes-dashboard-855c9754f9-tww5k" podUID="62e0e2ab-361c-43b1-b518-50508c7157ce"
	Nov 24 09:09:45 functional-941011 kubelet[4469]: E1124 09:09:45.371080    4469 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nginx\" with ImagePullBackOff: \"Back-off pulling image \\\"docker.io/nginx:alpine\\\": ErrImagePull: failed to pull and unpack image \\\"docker.io/library/nginx:alpine\\\": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/library/nginx/manifests/sha256:b3c656d55d7ad751196f21b7fd2e8d4da9cb430e32f646adcf92441b72f82b14: 429 Too Many Requests\\ntoomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit\"" pod="default/nginx-svc" podUID="d266cb95-cc24-4a91-a764-df9ddabaf208"
	Nov 24 09:09:49 functional-941011 kubelet[4469]: E1124 09:09:49.371176    4469 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dashboard-metrics-scraper\" with ImagePullBackOff: \"Back-off pulling image \\\"docker.io/kubernetesui/metrics-scraper:v1.0.8@sha256:76049887f07a0476dc93efc2d3569b9529bf982b22d29f356092ce206e98765c\\\": ErrImagePull: failed to pull and unpack image \\\"docker.io/kubernetesui/metrics-scraper@sha256:76049887f07a0476dc93efc2d3569b9529bf982b22d29f356092ce206e98765c\\\": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/kubernetesui/metrics-scraper/manifests/sha256:76049887f07a0476dc93efc2d3569b9529bf982b22d29f356092ce206e98765c: 429 Too Many Requests\\ntoomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit\"" pod="kubernetes-dashboard/dashboard-metrics-scraper-77bf4d6c4c-5qb25" podUID="cd7638bf-35f8-43ed-a194-705cf32ed1fc"
	Nov 24 09:09:52 functional-941011 kubelet[4469]: E1124 09:09:52.370312    4469 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"myfrontend\" with ImagePullBackOff: \"Back-off pulling image \\\"docker.io/nginx\\\": ErrImagePull: failed to pull and unpack image \\\"docker.io/library/nginx:latest\\\": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/library/nginx/manifests/sha256:553f64aecdc31b5bf944521731cd70e35da4faed96b2b7548a3d8e2598c52a42: 429 Too Many Requests\\ntoomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit\"" pod="default/sp-pod" podUID="8483abe7-6e51-4bf9-9b52-141abe46cd3e"
	Nov 24 09:09:54 functional-941011 kubelet[4469]: E1124 09:09:54.370849    4469 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"echo-server\" with ImagePullBackOff: \"Back-off pulling image \\\"kicbase/echo-server\\\": ErrImagePull: failed to pull and unpack image \\\"docker.io/kicbase/echo-server:latest\\\": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/kicbase/echo-server/manifests/sha256:42a89d9b22e5307cb88494990d5d929c401339f508c0a7e98a4d8ac52623fc5b: 429 Too Many Requests\\ntoomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit\"" pod="default/hello-node-connect-7d85dfc575-vjgqt" podUID="672307bf-173d-40e2-b7c4-0b44c2cea44f"
	Nov 24 09:09:54 functional-941011 kubelet[4469]: E1124 09:09:54.371627    4469 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubernetes-dashboard\" with ImagePullBackOff: \"Back-off pulling image \\\"docker.io/kubernetesui/dashboard:v2.7.0@sha256:2e500d29e9d5f4a086b908eb8dfe7ecac57d2ab09d65b24f588b1d449841ef93\\\": ErrImagePull: failed to pull and unpack image \\\"docker.io/kubernetesui/dashboard@sha256:2e500d29e9d5f4a086b908eb8dfe7ecac57d2ab09d65b24f588b1d449841ef93\\\": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/kubernetesui/dashboard/manifests/sha256:2e500d29e9d5f4a086b908eb8dfe7ecac57d2ab09d65b24f588b1d449841ef93: 429 Too Many Requests\\ntoomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit\"" pod="kubernetes-dashboard/kubernetes-dashboard-855c9754f9-tww5k" podUID="62e0e2ab-361c-43b1-b518-50508c7157ce"
	Nov 24 09:09:56 functional-941011 kubelet[4469]: E1124 09:09:56.370930    4469 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nginx\" with ImagePullBackOff: \"Back-off pulling image \\\"docker.io/nginx:alpine\\\": ErrImagePull: failed to pull and unpack image \\\"docker.io/library/nginx:alpine\\\": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/library/nginx/manifests/sha256:b3c656d55d7ad751196f21b7fd2e8d4da9cb430e32f646adcf92441b72f82b14: 429 Too Many Requests\\ntoomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit\"" pod="default/nginx-svc" podUID="d266cb95-cc24-4a91-a764-df9ddabaf208"
	Nov 24 09:10:02 functional-941011 kubelet[4469]: E1124 09:10:02.374089    4469 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dashboard-metrics-scraper\" with ImagePullBackOff: \"Back-off pulling image \\\"docker.io/kubernetesui/metrics-scraper:v1.0.8@sha256:76049887f07a0476dc93efc2d3569b9529bf982b22d29f356092ce206e98765c\\\": ErrImagePull: failed to pull and unpack image \\\"docker.io/kubernetesui/metrics-scraper@sha256:76049887f07a0476dc93efc2d3569b9529bf982b22d29f356092ce206e98765c\\\": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/kubernetesui/metrics-scraper/manifests/sha256:76049887f07a0476dc93efc2d3569b9529bf982b22d29f356092ce206e98765c: 429 Too Many Requests\\ntoomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit\"" pod="kubernetes-dashboard/dashboard-metrics-scraper-77bf4d6c4c-5qb25" podUID="cd7638bf-35f8-43ed-a194-705cf32ed1fc"
	Nov 24 09:10:05 functional-941011 kubelet[4469]: E1124 09:10:05.370761    4469 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"echo-server\" with ImagePullBackOff: \"Back-off pulling image \\\"kicbase/echo-server\\\": ErrImagePull: failed to pull and unpack image \\\"docker.io/kicbase/echo-server:latest\\\": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/kicbase/echo-server/manifests/sha256:42a89d9b22e5307cb88494990d5d929c401339f508c0a7e98a4d8ac52623fc5b: 429 Too Many Requests\\ntoomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit\"" pod="default/hello-node-connect-7d85dfc575-vjgqt" podUID="672307bf-173d-40e2-b7c4-0b44c2cea44f"
	Nov 24 09:10:06 functional-941011 kubelet[4469]: E1124 09:10:06.372702    4469 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"myfrontend\" with ImagePullBackOff: \"Back-off pulling image \\\"docker.io/nginx\\\": ErrImagePull: failed to pull and unpack image \\\"docker.io/library/nginx:latest\\\": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/library/nginx/manifests/sha256:553f64aecdc31b5bf944521731cd70e35da4faed96b2b7548a3d8e2598c52a42: 429 Too Many Requests\\ntoomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit\"" pod="default/sp-pod" podUID="8483abe7-6e51-4bf9-9b52-141abe46cd3e"
	Nov 24 09:10:08 functional-941011 kubelet[4469]: E1124 09:10:08.370787    4469 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubernetes-dashboard\" with ImagePullBackOff: \"Back-off pulling image \\\"docker.io/kubernetesui/dashboard:v2.7.0@sha256:2e500d29e9d5f4a086b908eb8dfe7ecac57d2ab09d65b24f588b1d449841ef93\\\": ErrImagePull: failed to pull and unpack image \\\"docker.io/kubernetesui/dashboard@sha256:2e500d29e9d5f4a086b908eb8dfe7ecac57d2ab09d65b24f588b1d449841ef93\\\": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/kubernetesui/dashboard/manifests/sha256:2e500d29e9d5f4a086b908eb8dfe7ecac57d2ab09d65b24f588b1d449841ef93: 429 Too Many Requests\\ntoomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit\"" pod="kubernetes-dashboard/kubernetes-dashboard-855c9754f9-tww5k" podUID="62e0e2ab-361c-43b1-b518-50508c7157ce"
	Nov 24 09:10:10 functional-941011 kubelet[4469]: E1124 09:10:10.370576    4469 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nginx\" with ImagePullBackOff: \"Back-off pulling image \\\"docker.io/nginx:alpine\\\": ErrImagePull: failed to pull and unpack image \\\"docker.io/library/nginx:alpine\\\": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/library/nginx/manifests/sha256:b3c656d55d7ad751196f21b7fd2e8d4da9cb430e32f646adcf92441b72f82b14: 429 Too Many Requests\\ntoomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit\"" pod="default/nginx-svc" podUID="d266cb95-cc24-4a91-a764-df9ddabaf208"
	Nov 24 09:10:16 functional-941011 kubelet[4469]: E1124 09:10:16.370790    4469 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dashboard-metrics-scraper\" with ImagePullBackOff: \"Back-off pulling image \\\"docker.io/kubernetesui/metrics-scraper:v1.0.8@sha256:76049887f07a0476dc93efc2d3569b9529bf982b22d29f356092ce206e98765c\\\": ErrImagePull: failed to pull and unpack image \\\"docker.io/kubernetesui/metrics-scraper@sha256:76049887f07a0476dc93efc2d3569b9529bf982b22d29f356092ce206e98765c\\\": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/kubernetesui/metrics-scraper/manifests/sha256:76049887f07a0476dc93efc2d3569b9529bf982b22d29f356092ce206e98765c: 429 Too Many Requests\\ntoomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit\"" pod="kubernetes-dashboard/dashboard-metrics-scraper-77bf4d6c4c-5qb25" podUID="cd7638bf-35f8-43ed-a194-705cf32ed1fc"
	Nov 24 09:10:19 functional-941011 kubelet[4469]: E1124 09:10:19.370180    4469 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"echo-server\" with ImagePullBackOff: \"Back-off pulling image \\\"kicbase/echo-server\\\": ErrImagePull: failed to pull and unpack image \\\"docker.io/kicbase/echo-server:latest\\\": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/kicbase/echo-server/manifests/sha256:42a89d9b22e5307cb88494990d5d929c401339f508c0a7e98a4d8ac52623fc5b: 429 Too Many Requests\\ntoomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit\"" pod="default/hello-node-connect-7d85dfc575-vjgqt" podUID="672307bf-173d-40e2-b7c4-0b44c2cea44f"
	Nov 24 09:10:20 functional-941011 kubelet[4469]: E1124 09:10:20.369705    4469 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"myfrontend\" with ImagePullBackOff: \"Back-off pulling image \\\"docker.io/nginx\\\": ErrImagePull: failed to pull and unpack image \\\"docker.io/library/nginx:latest\\\": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/library/nginx/manifests/sha256:553f64aecdc31b5bf944521731cd70e35da4faed96b2b7548a3d8e2598c52a42: 429 Too Many Requests\\ntoomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit\"" pod="default/sp-pod" podUID="8483abe7-6e51-4bf9-9b52-141abe46cd3e"
	Nov 24 09:10:21 functional-941011 kubelet[4469]: E1124 09:10:21.370818    4469 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubernetes-dashboard\" with ImagePullBackOff: \"Back-off pulling image \\\"docker.io/kubernetesui/dashboard:v2.7.0@sha256:2e500d29e9d5f4a086b908eb8dfe7ecac57d2ab09d65b24f588b1d449841ef93\\\": ErrImagePull: failed to pull and unpack image \\\"docker.io/kubernetesui/dashboard@sha256:2e500d29e9d5f4a086b908eb8dfe7ecac57d2ab09d65b24f588b1d449841ef93\\\": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/kubernetesui/dashboard/manifests/sha256:2e500d29e9d5f4a086b908eb8dfe7ecac57d2ab09d65b24f588b1d449841ef93: 429 Too Many Requests\\ntoomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit\"" pod="kubernetes-dashboard/kubernetes-dashboard-855c9754f9-tww5k" podUID="62e0e2ab-361c-43b1-b518-50508c7157ce"
	Nov 24 09:10:23 functional-941011 kubelet[4469]: E1124 09:10:23.370926    4469 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nginx\" with ImagePullBackOff: \"Back-off pulling image \\\"docker.io/nginx:alpine\\\": ErrImagePull: failed to pull and unpack image \\\"docker.io/library/nginx:alpine\\\": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/library/nginx/manifests/sha256:b3c656d55d7ad751196f21b7fd2e8d4da9cb430e32f646adcf92441b72f82b14: 429 Too Many Requests\\ntoomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit\"" pod="default/nginx-svc" podUID="d266cb95-cc24-4a91-a764-df9ddabaf208"
	Nov 24 09:10:31 functional-941011 kubelet[4469]: E1124 09:10:31.370336    4469 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dashboard-metrics-scraper\" with ImagePullBackOff: \"Back-off pulling image \\\"docker.io/kubernetesui/metrics-scraper:v1.0.8@sha256:76049887f07a0476dc93efc2d3569b9529bf982b22d29f356092ce206e98765c\\\": ErrImagePull: failed to pull and unpack image \\\"docker.io/kubernetesui/metrics-scraper@sha256:76049887f07a0476dc93efc2d3569b9529bf982b22d29f356092ce206e98765c\\\": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/kubernetesui/metrics-scraper/manifests/sha256:76049887f07a0476dc93efc2d3569b9529bf982b22d29f356092ce206e98765c: 429 Too Many Requests\\ntoomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit\"" pod="kubernetes-dashboard/dashboard-metrics-scraper-77bf4d6c4c-5qb25" podUID="cd7638bf-35f8-43ed-a194-705cf32ed1fc"
	Nov 24 09:10:33 functional-941011 kubelet[4469]: E1124 09:10:33.371457    4469 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"echo-server\" with ImagePullBackOff: \"Back-off pulling image \\\"kicbase/echo-server\\\": ErrImagePull: failed to pull and unpack image \\\"docker.io/kicbase/echo-server:latest\\\": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/kicbase/echo-server/manifests/sha256:42a89d9b22e5307cb88494990d5d929c401339f508c0a7e98a4d8ac52623fc5b: 429 Too Many Requests\\ntoomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit\"" pod="default/hello-node-connect-7d85dfc575-vjgqt" podUID="672307bf-173d-40e2-b7c4-0b44c2cea44f"
	Nov 24 09:10:33 functional-941011 kubelet[4469]: E1124 09:10:33.372881    4469 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"myfrontend\" with ImagePullBackOff: \"Back-off pulling image \\\"docker.io/nginx\\\": ErrImagePull: failed to pull and unpack image \\\"docker.io/library/nginx:latest\\\": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/library/nginx/manifests/sha256:553f64aecdc31b5bf944521731cd70e35da4faed96b2b7548a3d8e2598c52a42: 429 Too Many Requests\\ntoomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit\"" pod="default/sp-pod" podUID="8483abe7-6e51-4bf9-9b52-141abe46cd3e"
	Nov 24 09:10:35 functional-941011 kubelet[4469]: E1124 09:10:35.371823    4469 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubernetes-dashboard\" with ImagePullBackOff: \"Back-off pulling image \\\"docker.io/kubernetesui/dashboard:v2.7.0@sha256:2e500d29e9d5f4a086b908eb8dfe7ecac57d2ab09d65b24f588b1d449841ef93\\\": ErrImagePull: failed to pull and unpack image \\\"docker.io/kubernetesui/dashboard@sha256:2e500d29e9d5f4a086b908eb8dfe7ecac57d2ab09d65b24f588b1d449841ef93\\\": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/kubernetesui/dashboard/manifests/sha256:2e500d29e9d5f4a086b908eb8dfe7ecac57d2ab09d65b24f588b1d449841ef93: 429 Too Many Requests\\ntoomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit\"" pod="kubernetes-dashboard/kubernetes-dashboard-855c9754f9-tww5k" podUID="62e0e2ab-361c-43b1-b518-50508c7157ce"
	Nov 24 09:10:36 functional-941011 kubelet[4469]: E1124 09:10:36.370833    4469 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nginx\" with ImagePullBackOff: \"Back-off pulling image \\\"docker.io/nginx:alpine\\\": ErrImagePull: failed to pull and unpack image \\\"docker.io/library/nginx:alpine\\\": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/library/nginx/manifests/sha256:b3c656d55d7ad751196f21b7fd2e8d4da9cb430e32f646adcf92441b72f82b14: 429 Too Many Requests\\ntoomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit\"" pod="default/nginx-svc" podUID="d266cb95-cc24-4a91-a764-df9ddabaf208"
	Nov 24 09:10:44 functional-941011 kubelet[4469]: E1124 09:10:44.371174    4469 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dashboard-metrics-scraper\" with ImagePullBackOff: \"Back-off pulling image \\\"docker.io/kubernetesui/metrics-scraper:v1.0.8@sha256:76049887f07a0476dc93efc2d3569b9529bf982b22d29f356092ce206e98765c\\\": ErrImagePull: failed to pull and unpack image \\\"docker.io/kubernetesui/metrics-scraper@sha256:76049887f07a0476dc93efc2d3569b9529bf982b22d29f356092ce206e98765c\\\": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/kubernetesui/metrics-scraper/manifests/sha256:76049887f07a0476dc93efc2d3569b9529bf982b22d29f356092ce206e98765c: 429 Too Many Requests\\ntoomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit\"" pod="kubernetes-dashboard/dashboard-metrics-scraper-77bf4d6c4c-5qb25" podUID="cd7638bf-35f8-43ed-a194-705cf32ed1fc"
	Nov 24 09:10:46 functional-941011 kubelet[4469]: E1124 09:10:46.370249    4469 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"echo-server\" with ImagePullBackOff: \"Back-off pulling image \\\"kicbase/echo-server\\\": ErrImagePull: failed to pull and unpack image \\\"docker.io/kicbase/echo-server:latest\\\": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/kicbase/echo-server/manifests/sha256:42a89d9b22e5307cb88494990d5d929c401339f508c0a7e98a4d8ac52623fc5b: 429 Too Many Requests\\ntoomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit\"" pod="default/hello-node-connect-7d85dfc575-vjgqt" podUID="672307bf-173d-40e2-b7c4-0b44c2cea44f"
	Nov 24 09:10:46 functional-941011 kubelet[4469]: E1124 09:10:46.373676    4469 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"myfrontend\" with ImagePullBackOff: \"Back-off pulling image \\\"docker.io/nginx\\\": ErrImagePull: failed to pull and unpack image \\\"docker.io/library/nginx:latest\\\": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/library/nginx/manifests/sha256:553f64aecdc31b5bf944521731cd70e35da4faed96b2b7548a3d8e2598c52a42: 429 Too Many Requests\\ntoomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit\"" pod="default/sp-pod" podUID="8483abe7-6e51-4bf9-9b52-141abe46cd3e"
	
	
	==> storage-provisioner [b6a84160fb5a462413dc19bd1857be8f2613f26a584ad28f2b8052ac97712585] <==
	I1124 08:55:29.097755       1 storage_provisioner.go:116] Initializing the minikube storage provisioner...
	F1124 08:55:29.099748       1 main.go:39] error getting server version: Get "https://10.96.0.1:443/version?timeout=32s": dial tcp 10.96.0.1:443: connect: connection refused
	
	
	==> storage-provisioner [e068a2929b9a7acaf417417b25ba4835b5824322af5bcbddaa3f41f7d5cb8575] <==
	W1124 09:10:23.329998       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 09:10:25.333466       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 09:10:25.337802       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 09:10:27.340850       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 09:10:27.345386       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 09:10:29.348159       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 09:10:29.352811       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 09:10:31.355987       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 09:10:31.360336       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 09:10:33.363349       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 09:10:33.370704       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 09:10:35.373492       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 09:10:35.378418       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 09:10:37.381044       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 09:10:37.387748       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 09:10:39.390587       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 09:10:39.395075       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 09:10:41.398528       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 09:10:41.403151       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 09:10:43.406388       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 09:10:43.411190       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 09:10:45.415090       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 09:10:45.421317       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 09:10:47.426221       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 09:10:47.431295       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	

                                                
                                                
-- /stdout --
helpers_test.go:262: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-941011 -n functional-941011
helpers_test.go:269: (dbg) Run:  kubectl --context functional-941011 get po -o=jsonpath={.items[*].metadata.name} -A --field-selector=status.phase!=Running
helpers_test.go:280: non-running pods: busybox-mount hello-node-connect-7d85dfc575-vjgqt nginx-svc sp-pod dashboard-metrics-scraper-77bf4d6c4c-5qb25 kubernetes-dashboard-855c9754f9-tww5k
helpers_test.go:282: ======> post-mortem[TestFunctional/parallel/ServiceCmdConnect]: describe non-running pods <======
helpers_test.go:285: (dbg) Run:  kubectl --context functional-941011 describe pod busybox-mount hello-node-connect-7d85dfc575-vjgqt nginx-svc sp-pod dashboard-metrics-scraper-77bf4d6c4c-5qb25 kubernetes-dashboard-855c9754f9-tww5k
helpers_test.go:285: (dbg) Non-zero exit: kubectl --context functional-941011 describe pod busybox-mount hello-node-connect-7d85dfc575-vjgqt nginx-svc sp-pod dashboard-metrics-scraper-77bf4d6c4c-5qb25 kubernetes-dashboard-855c9754f9-tww5k: exit status 1 (130.241092ms)

                                                
                                                
-- stdout --
	Name:             busybox-mount
	Namespace:        default
	Priority:         0
	Service Account:  default
	Node:             functional-941011/192.168.49.2
	Start Time:       Mon, 24 Nov 2025 09:02:01 +0000
	Labels:           integration-test=busybox-mount
	Annotations:      <none>
	Status:           Succeeded
	IP:               10.244.0.8
	IPs:
	  IP:  10.244.0.8
	Containers:
	  mount-munger:
	    Container ID:  containerd://423ebfbd588615eb4bc213b5f471b5301692b734e90d6664d802a7debc9f3474
	    Image:         gcr.io/k8s-minikube/busybox:1.28.4-glibc
	    Image ID:      gcr.io/k8s-minikube/busybox@sha256:2d03e6ceeb99250061dd110530b0ece7998cd84121f952adef120ea7c5a6f00e
	    Port:          <none>
	    Host Port:     <none>
	    Command:
	      /bin/sh
	      -c
	      --
	    Args:
	      cat /mount-9p/created-by-test; echo test > /mount-9p/created-by-pod; rm /mount-9p/created-by-test-removed-by-pod; echo test > /mount-9p/created-by-pod-removed-by-test date >> /mount-9p/pod-dates
	    State:          Terminated
	      Reason:       Completed
	      Exit Code:    0
	      Started:      Mon, 24 Nov 2025 09:02:03 +0000
	      Finished:     Mon, 24 Nov 2025 09:02:03 +0000
	    Ready:          False
	    Restart Count:  0
	    Environment:    <none>
	    Mounts:
	      /mount-9p from test-volume (rw)
	      /var/run/secrets/kubernetes.io/serviceaccount from kube-api-access-6f7z2 (ro)
	Conditions:
	  Type                        Status
	  PodReadyToStartContainers   False 
	  Initialized                 True 
	  Ready                       False 
	  ContainersReady             False 
	  PodScheduled                True 
	Volumes:
	  test-volume:
	    Type:          HostPath (bare host directory volume)
	    Path:          /mount-9p
	    HostPathType:  
	  kube-api-access-6f7z2:
	    Type:                    Projected (a volume that contains injected data from multiple sources)
	    TokenExpirationSeconds:  3607
	    ConfigMapName:           kube-root-ca.crt
	    Optional:                false
	    DownwardAPI:             true
	QoS Class:                   BestEffort
	Node-Selectors:              <none>
	Tolerations:                 node.kubernetes.io/not-ready:NoExecute op=Exists for 300s
	                             node.kubernetes.io/unreachable:NoExecute op=Exists for 300s
	Events:
	  Type    Reason     Age    From               Message
	  ----    ------     ----   ----               -------
	  Normal  Scheduled  8m47s  default-scheduler  Successfully assigned default/busybox-mount to functional-941011
	  Normal  Pulling    8m47s  kubelet            Pulling image "gcr.io/k8s-minikube/busybox:1.28.4-glibc"
	  Normal  Pulled     8m45s  kubelet            Successfully pulled image "gcr.io/k8s-minikube/busybox:1.28.4-glibc" in 2.061s (2.062s including waiting). Image size: 1935750 bytes.
	  Normal  Created    8m45s  kubelet            Created container: mount-munger
	  Normal  Started    8m45s  kubelet            Started container mount-munger
	
	
	Name:             hello-node-connect-7d85dfc575-vjgqt
	Namespace:        default
	Priority:         0
	Service Account:  default
	Node:             functional-941011/192.168.49.2
	Start Time:       Mon, 24 Nov 2025 09:00:44 +0000
	Labels:           app=hello-node-connect
	                  pod-template-hash=7d85dfc575
	Annotations:      <none>
	Status:           Pending
	IP:               10.244.0.7
	IPs:
	  IP:           10.244.0.7
	Controlled By:  ReplicaSet/hello-node-connect-7d85dfc575
	Containers:
	  echo-server:
	    Container ID:   
	    Image:          kicbase/echo-server
	    Image ID:       
	    Port:           <none>
	    Host Port:      <none>
	    State:          Waiting
	      Reason:       ImagePullBackOff
	    Ready:          False
	    Restart Count:  0
	    Environment:    <none>
	    Mounts:
	      /var/run/secrets/kubernetes.io/serviceaccount from kube-api-access-8n6nc (ro)
	Conditions:
	  Type                        Status
	  PodReadyToStartContainers   True 
	  Initialized                 True 
	  Ready                       False 
	  ContainersReady             False 
	  PodScheduled                True 
	Volumes:
	  kube-api-access-8n6nc:
	    Type:                    Projected (a volume that contains injected data from multiple sources)
	    TokenExpirationSeconds:  3607
	    ConfigMapName:           kube-root-ca.crt
	    Optional:                false
	    DownwardAPI:             true
	QoS Class:                   BestEffort
	Node-Selectors:              <none>
	Tolerations:                 node.kubernetes.io/not-ready:NoExecute op=Exists for 300s
	                             node.kubernetes.io/unreachable:NoExecute op=Exists for 300s
	Events:
	  Type     Reason     Age   From               Message
	  ----     ------     ----  ----               -------
	  Normal   Scheduled  10m   default-scheduler  Successfully assigned default/hello-node-connect-7d85dfc575-vjgqt to functional-941011
	  Warning  Failed     10m   kubelet            Failed to pull image "kicbase/echo-server": failed to pull and unpack image "docker.io/kicbase/echo-server:latest": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/kicbase/echo-server/manifests/sha256:42a89d9b22e5307cb88494990d5d929c401339f508c0a7e98a4d8ac52623fc5b: 429 Too Many Requests
	toomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit
	  Normal   Pulling  7m1s (x5 over 10m)    kubelet  Pulling image "kicbase/echo-server"
	  Warning  Failed   7m1s (x5 over 10m)    kubelet  Error: ErrImagePull
	  Warning  Failed   7m1s (x4 over 9m49s)  kubelet  Failed to pull image "kicbase/echo-server": failed to pull and unpack image "docker.io/kicbase/echo-server:latest": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/kicbase/echo-server/manifests/sha256:127ac38a2bb9537b7f252addff209ea6801edcac8a92c8b1104dacd66a583ed6: 429 Too Many Requests
	toomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit
	  Normal   BackOff  2s (x43 over 10m)  kubelet  Back-off pulling image "kicbase/echo-server"
	  Warning  Failed   2s (x43 over 10m)  kubelet  Error: ImagePullBackOff
	
	
	Name:             nginx-svc
	Namespace:        default
	Priority:         0
	Service Account:  default
	Node:             functional-941011/192.168.49.2
	Start Time:       Mon, 24 Nov 2025 08:56:29 +0000
	Labels:           run=nginx-svc
	Annotations:      <none>
	Status:           Pending
	IP:               10.244.0.5
	IPs:
	  IP:  10.244.0.5
	Containers:
	  nginx:
	    Container ID:   
	    Image:          docker.io/nginx:alpine
	    Image ID:       
	    Port:           80/TCP
	    Host Port:      0/TCP
	    State:          Waiting
	      Reason:       ImagePullBackOff
	    Ready:          False
	    Restart Count:  0
	    Environment:    <none>
	    Mounts:
	      /var/run/secrets/kubernetes.io/serviceaccount from kube-api-access-qt757 (ro)
	Conditions:
	  Type                        Status
	  PodReadyToStartContainers   True 
	  Initialized                 True 
	  Ready                       False 
	  ContainersReady             False 
	  PodScheduled                True 
	Volumes:
	  kube-api-access-qt757:
	    Type:                    Projected (a volume that contains injected data from multiple sources)
	    TokenExpirationSeconds:  3607
	    ConfigMapName:           kube-root-ca.crt
	    Optional:                false
	    DownwardAPI:             true
	QoS Class:                   BestEffort
	Node-Selectors:              <none>
	Tolerations:                 node.kubernetes.io/not-ready:NoExecute op=Exists for 300s
	                             node.kubernetes.io/unreachable:NoExecute op=Exists for 300s
	Events:
	  Type     Reason     Age                From               Message
	  ----     ------     ----               ----               -------
	  Normal   Scheduled  14m                default-scheduler  Successfully assigned default/nginx-svc to functional-941011
	  Warning  Failed     12m (x3 over 14m)  kubelet            Failed to pull image "docker.io/nginx:alpine": failed to pull and unpack image "docker.io/library/nginx:alpine": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/library/nginx/manifests/sha256:b3c656d55d7ad751196f21b7fd2e8d4da9cb430e32f646adcf92441b72f82b14: 429 Too Many Requests
	toomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit
	  Normal   Pulling  11m (x5 over 14m)  kubelet  Pulling image "docker.io/nginx:alpine"
	  Warning  Failed   11m (x5 over 14m)  kubelet  Error: ErrImagePull
	  Warning  Failed   11m (x2 over 13m)  kubelet  Failed to pull image "docker.io/nginx:alpine": failed to pull and unpack image "docker.io/library/nginx:alpine": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/library/nginx/manifests/sha256:7391b3732e7f7ccd23ff1d02fbeadcde496f374d7460ad8e79260f8f6d2c9f90: 429 Too Many Requests
	toomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit
	  Normal   BackOff  4m7s (x42 over 14m)  kubelet  Back-off pulling image "docker.io/nginx:alpine"
	  Warning  Failed   4m7s (x42 over 14m)  kubelet  Error: ImagePullBackOff
	
	
	Name:             sp-pod
	Namespace:        default
	Priority:         0
	Service Account:  default
	Node:             functional-941011/192.168.49.2
	Start Time:       Mon, 24 Nov 2025 08:56:41 +0000
	Labels:           test=storage-provisioner
	Annotations:      <none>
	Status:           Pending
	IP:               10.244.0.6
	IPs:
	  IP:  10.244.0.6
	Containers:
	  myfrontend:
	    Container ID:   
	    Image:          docker.io/nginx
	    Image ID:       
	    Port:           <none>
	    Host Port:      <none>
	    State:          Waiting
	      Reason:       ImagePullBackOff
	    Ready:          False
	    Restart Count:  0
	    Environment:    <none>
	    Mounts:
	      /tmp/mount from mypd (rw)
	      /var/run/secrets/kubernetes.io/serviceaccount from kube-api-access-26mp7 (ro)
	Conditions:
	  Type                        Status
	  PodReadyToStartContainers   True 
	  Initialized                 True 
	  Ready                       False 
	  ContainersReady             False 
	  PodScheduled                True 
	Volumes:
	  mypd:
	    Type:       PersistentVolumeClaim (a reference to a PersistentVolumeClaim in the same namespace)
	    ClaimName:  myclaim
	    ReadOnly:   false
	  kube-api-access-26mp7:
	    Type:                    Projected (a volume that contains injected data from multiple sources)
	    TokenExpirationSeconds:  3607
	    ConfigMapName:           kube-root-ca.crt
	    Optional:                false
	    DownwardAPI:             true
	QoS Class:                   BestEffort
	Node-Selectors:              <none>
	Tolerations:                 node.kubernetes.io/not-ready:NoExecute op=Exists for 300s
	                             node.kubernetes.io/unreachable:NoExecute op=Exists for 300s
	Events:
	  Type     Reason     Age                From               Message
	  ----     ------     ----               ----               -------
	  Normal   Scheduled  14m                default-scheduler  Successfully assigned default/sp-pod to functional-941011
	  Normal   Pulling    11m (x5 over 14m)  kubelet            Pulling image "docker.io/nginx"
	  Warning  Failed     11m (x5 over 14m)  kubelet            Failed to pull image "docker.io/nginx": failed to pull and unpack image "docker.io/library/nginx:latest": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/library/nginx/manifests/sha256:553f64aecdc31b5bf944521731cd70e35da4faed96b2b7548a3d8e2598c52a42: 429 Too Many Requests
	toomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit
	  Warning  Failed   11m (x5 over 14m)    kubelet  Error: ErrImagePull
	  Normal   BackOff  4m5s (x42 over 14m)  kubelet  Back-off pulling image "docker.io/nginx"
	  Warning  Failed   4m5s (x42 over 14m)  kubelet  Error: ImagePullBackOff

                                                
                                                
-- /stdout --
** stderr ** 
	Error from server (NotFound): pods "dashboard-metrics-scraper-77bf4d6c4c-5qb25" not found
	Error from server (NotFound): pods "kubernetes-dashboard-855c9754f9-tww5k" not found

                                                
                                                
** /stderr **
helpers_test.go:287: kubectl --context functional-941011 describe pod busybox-mount hello-node-connect-7d85dfc575-vjgqt nginx-svc sp-pod dashboard-metrics-scraper-77bf4d6c4c-5qb25 kubernetes-dashboard-855c9754f9-tww5k: exit status 1
--- FAIL: TestFunctional/parallel/ServiceCmdConnect (603.67s)

                                                
                                    
TestFunctional/parallel/PersistentVolumeClaim (249.67s)

                                                
                                                
=== RUN   TestFunctional/parallel/PersistentVolumeClaim
=== PAUSE TestFunctional/parallel/PersistentVolumeClaim

                                                
                                                

                                                
                                                
=== CONT  TestFunctional/parallel/PersistentVolumeClaim
functional_test_pvc_test.go:50: (dbg) TestFunctional/parallel/PersistentVolumeClaim: waiting 4m0s for pods matching "integration-test=storage-provisioner" in namespace "kube-system" ...
helpers_test.go:352: "storage-provisioner" [13c14b20-a1ad-4212-98a5-325e1115d97a] Running
functional_test_pvc_test.go:50: (dbg) TestFunctional/parallel/PersistentVolumeClaim: integration-test=storage-provisioner healthy within 6.003634888s
functional_test_pvc_test.go:55: (dbg) Run:  kubectl --context functional-941011 get storageclass -o=json
functional_test_pvc_test.go:75: (dbg) Run:  kubectl --context functional-941011 apply -f testdata/storage-provisioner/pvc.yaml
functional_test_pvc_test.go:82: (dbg) Run:  kubectl --context functional-941011 get pvc myclaim -o=json
functional_test_pvc_test.go:131: (dbg) Run:  kubectl --context functional-941011 apply -f testdata/storage-provisioner/pod.yaml
functional_test_pvc_test.go:140: (dbg) TestFunctional/parallel/PersistentVolumeClaim: waiting 4m0s for pods matching "test=storage-provisioner" in namespace "default" ...
helpers_test.go:352: "sp-pod" [8483abe7-6e51-4bf9-9b52-141abe46cd3e] Pending
helpers_test.go:352: "sp-pod" [8483abe7-6e51-4bf9-9b52-141abe46cd3e] Pending / Ready:ContainersNotReady (containers with unready status: [myfrontend]) / ContainersReady:ContainersNotReady (containers with unready status: [myfrontend])
E1124 08:56:44.586817 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/addons-674149/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1124 08:57:25.548189 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/addons-674149/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1124 08:58:47.469816 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/addons-674149/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
helpers_test.go:337: TestFunctional/parallel/PersistentVolumeClaim: WARNING: pod list for "default" "test=storage-provisioner" returned: client rate limiter Wait returned an error: context deadline exceeded
functional_test_pvc_test.go:140: ***** TestFunctional/parallel/PersistentVolumeClaim: pod "test=storage-provisioner" failed to start within 4m0s: context deadline exceeded ****
functional_test_pvc_test.go:140: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-941011 -n functional-941011
functional_test_pvc_test.go:140: TestFunctional/parallel/PersistentVolumeClaim: showing logs for failed pods as of 2025-11-24 09:00:41.478922672 +0000 UTC m=+1056.846229210
functional_test_pvc_test.go:140: (dbg) Run:  kubectl --context functional-941011 describe po sp-pod -n default
functional_test_pvc_test.go:140: (dbg) kubectl --context functional-941011 describe po sp-pod -n default:
Name:             sp-pod
Namespace:        default
Priority:         0
Service Account:  default
Node:             functional-941011/192.168.49.2
Start Time:       Mon, 24 Nov 2025 08:56:41 +0000
Labels:           test=storage-provisioner
Annotations:      <none>
Status:           Pending
IP:               10.244.0.6
IPs:
IP:  10.244.0.6
Containers:
myfrontend:
Container ID:   
Image:          docker.io/nginx
Image ID:       
Port:           <none>
Host Port:      <none>
State:          Waiting
Reason:       ImagePullBackOff
Ready:          False
Restart Count:  0
Environment:    <none>
Mounts:
/tmp/mount from mypd (rw)
/var/run/secrets/kubernetes.io/serviceaccount from kube-api-access-26mp7 (ro)
Conditions:
Type                        Status
PodReadyToStartContainers   True 
Initialized                 True 
Ready                       False 
ContainersReady             False 
PodScheduled                True 
Volumes:
mypd:
Type:       PersistentVolumeClaim (a reference to a PersistentVolumeClaim in the same namespace)
ClaimName:  myclaim
ReadOnly:   false
kube-api-access-26mp7:
Type:                    Projected (a volume that contains injected data from multiple sources)
TokenExpirationSeconds:  3607
ConfigMapName:           kube-root-ca.crt
Optional:                false
DownwardAPI:             true
QoS Class:                   BestEffort
Node-Selectors:              <none>
Tolerations:                 node.kubernetes.io/not-ready:NoExecute op=Exists for 300s
node.kubernetes.io/unreachable:NoExecute op=Exists for 300s
Events:
Type     Reason     Age                  From               Message
----     ------     ----                 ----               -------
Normal   Scheduled  4m                   default-scheduler  Successfully assigned default/sp-pod to functional-941011
Normal   Pulling    64s (x5 over 4m)     kubelet            Pulling image "docker.io/nginx"
Warning  Failed     64s (x5 over 3m59s)  kubelet            Failed to pull image "docker.io/nginx": failed to pull and unpack image "docker.io/library/nginx:latest": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/library/nginx/manifests/sha256:553f64aecdc31b5bf944521731cd70e35da4faed96b2b7548a3d8e2598c52a42: 429 Too Many Requests
toomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit
Warning  Failed   64s (x5 over 3m59s)  kubelet  Error: ErrImagePull
Normal   BackOff  0s (x15 over 3m59s)  kubelet  Back-off pulling image "docker.io/nginx"
Warning  Failed   0s (x15 over 3m59s)  kubelet  Error: ImagePullBackOff
functional_test_pvc_test.go:140: (dbg) Run:  kubectl --context functional-941011 logs sp-pod -n default
functional_test_pvc_test.go:140: (dbg) Non-zero exit: kubectl --context functional-941011 logs sp-pod -n default: exit status 1 (101.931436ms)

** stderr ** 
	Error from server (BadRequest): container "myfrontend" in pod "sp-pod" is waiting to start: trying and failing to pull image

** /stderr **
functional_test_pvc_test.go:140: kubectl --context functional-941011 logs sp-pod -n default: exit status 1
functional_test_pvc_test.go:141: failed waiting for pvctest pod : test=storage-provisioner within 4m0s: context deadline exceeded
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:223: ======>  post-mortem[TestFunctional/parallel/PersistentVolumeClaim]: network settings <======
helpers_test.go:230: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:238: ======>  post-mortem[TestFunctional/parallel/PersistentVolumeClaim]: docker inspect <======
helpers_test.go:239: (dbg) Run:  docker inspect functional-941011
helpers_test.go:243: (dbg) docker inspect functional-941011:

-- stdout --
	[
	    {
	        "Id": "d8574c2bf48ce967fe6e255a178ebf105b1a3019c24cf41f2b79f5302c127773",
	        "Created": "2025-11-24T08:53:47.57593314Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 1679494,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-11-24T08:53:47.634288317Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:572c983e466f1f784136812eef5cc59ac623db764bc7704d3676c4643993fd08",
	        "ResolvConfPath": "/var/lib/docker/containers/d8574c2bf48ce967fe6e255a178ebf105b1a3019c24cf41f2b79f5302c127773/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/d8574c2bf48ce967fe6e255a178ebf105b1a3019c24cf41f2b79f5302c127773/hostname",
	        "HostsPath": "/var/lib/docker/containers/d8574c2bf48ce967fe6e255a178ebf105b1a3019c24cf41f2b79f5302c127773/hosts",
	        "LogPath": "/var/lib/docker/containers/d8574c2bf48ce967fe6e255a178ebf105b1a3019c24cf41f2b79f5302c127773/d8574c2bf48ce967fe6e255a178ebf105b1a3019c24cf41f2b79f5302c127773-json.log",
	        "Name": "/functional-941011",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "/lib/modules:/lib/modules:ro",
	                "functional-941011:/var"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-941011",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "d8574c2bf48ce967fe6e255a178ebf105b1a3019c24cf41f2b79f5302c127773",
	                "LowerDir": "/var/lib/docker/overlay2/6eeb35a95c7cafb92e460f903c5570446f5d8f4a8b4814f2aec79043c07ef0d3-init/diff:/var/lib/docker/overlay2/bf294786c7ade59cb761c7bdcc27045362c3779b00a569b685443f34ade17737/diff",
	                "MergedDir": "/var/lib/docker/overlay2/6eeb35a95c7cafb92e460f903c5570446f5d8f4a8b4814f2aec79043c07ef0d3/merged",
	                "UpperDir": "/var/lib/docker/overlay2/6eeb35a95c7cafb92e460f903c5570446f5d8f4a8b4814f2aec79043c07ef0d3/diff",
	                "WorkDir": "/var/lib/docker/overlay2/6eeb35a95c7cafb92e460f903c5570446f5d8f4a8b4814f2aec79043c07ef0d3/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            },
	            {
	                "Type": "volume",
	                "Name": "functional-941011",
	                "Source": "/var/lib/docker/volumes/functional-941011/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-941011",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-941011",
	                "name.minikube.sigs.k8s.io": "functional-941011",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "91cbecc0d651d94558cb202589b12e740389d40de185d06770e23f82cb68fc8d",
	            "SandboxKey": "/var/run/docker/netns/91cbecc0d651",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34679"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34680"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34683"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34681"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34682"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-941011": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "36:03:d7:7f:e5:c7",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "e7f7867899e274c20f652612139490d61ff49918c5fef46ebcab3194d02671b8",
	                    "EndpointID": "ca16f2cc76565150d8b128df549a1bd659112397b50cb5fa5c6631e2b78b03b5",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-941011",
	                        "d8574c2bf48c"
	                    ]
	                }
	            }
	        }
	    }
	]

-- /stdout --
helpers_test.go:247: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-941011 -n functional-941011
helpers_test.go:252: <<< TestFunctional/parallel/PersistentVolumeClaim FAILED: start of post-mortem logs <<<
helpers_test.go:253: ======>  post-mortem[TestFunctional/parallel/PersistentVolumeClaim]: minikube logs <======
helpers_test.go:255: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 logs -n 25
helpers_test.go:255: (dbg) Done: out/minikube-linux-arm64 -p functional-941011 logs -n 25: (1.584821405s)
helpers_test.go:260: TestFunctional/parallel/PersistentVolumeClaim logs: 
-- stdout --
	
	==> Audit <==
	┌─────────┬─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│ COMMAND │                                                                              ARGS                                                                               │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├─────────┼─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ ssh     │ functional-941011 ssh sudo cat /etc/ssl/certs/16544672.pem                                                                                                      │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 08:56 UTC │ 24 Nov 25 08:56 UTC │
	│ image   │ functional-941011 image ls                                                                                                                                      │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 08:56 UTC │ 24 Nov 25 08:56 UTC │
	│ ssh     │ functional-941011 ssh sudo cat /usr/share/ca-certificates/16544672.pem                                                                                          │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 08:56 UTC │ 24 Nov 25 08:56 UTC │
	│ image   │ functional-941011 image load --daemon kicbase/echo-server:functional-941011 --alsologtostderr                                                                   │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 08:56 UTC │ 24 Nov 25 08:56 UTC │
	│ ssh     │ functional-941011 ssh sudo cat /etc/ssl/certs/3ec20f2e.0                                                                                                        │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 08:56 UTC │ 24 Nov 25 08:56 UTC │
	│ ssh     │ functional-941011 ssh sudo cat /etc/test/nested/copy/1654467/hosts                                                                                              │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 08:56 UTC │ 24 Nov 25 08:56 UTC │
	│ image   │ functional-941011 image ls                                                                                                                                      │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 08:56 UTC │ 24 Nov 25 08:56 UTC │
	│ image   │ functional-941011 image load --daemon kicbase/echo-server:functional-941011 --alsologtostderr                                                                   │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 08:56 UTC │ 24 Nov 25 08:56 UTC │
	│ image   │ functional-941011 image ls                                                                                                                                      │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 08:56 UTC │ 24 Nov 25 08:56 UTC │
	│ image   │ functional-941011 image save kicbase/echo-server:functional-941011 /home/jenkins/workspace/Docker_Linux_containerd_arm64/echo-server-save.tar --alsologtostderr │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 08:56 UTC │ 24 Nov 25 08:56 UTC │
	│ image   │ functional-941011 image rm kicbase/echo-server:functional-941011 --alsologtostderr                                                                              │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 08:56 UTC │ 24 Nov 25 08:56 UTC │
	│ image   │ functional-941011 image ls                                                                                                                                      │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 08:56 UTC │ 24 Nov 25 08:56 UTC │
	│ image   │ functional-941011 image load /home/jenkins/workspace/Docker_Linux_containerd_arm64/echo-server-save.tar --alsologtostderr                                       │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 08:56 UTC │ 24 Nov 25 08:56 UTC │
	│ image   │ functional-941011 image ls                                                                                                                                      │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 08:56 UTC │ 24 Nov 25 08:56 UTC │
	│ image   │ functional-941011 image save --daemon kicbase/echo-server:functional-941011 --alsologtostderr                                                                   │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 08:56 UTC │ 24 Nov 25 08:56 UTC │
	│ ssh     │ functional-941011 ssh echo hello                                                                                                                                │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 08:56 UTC │ 24 Nov 25 08:56 UTC │
	│ ssh     │ functional-941011 ssh cat /etc/hostname                                                                                                                         │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 08:56 UTC │ 24 Nov 25 08:56 UTC │
	│ tunnel  │ functional-941011 tunnel --alsologtostderr                                                                                                                      │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 08:56 UTC │                     │
	│ tunnel  │ functional-941011 tunnel --alsologtostderr                                                                                                                      │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 08:56 UTC │                     │
	│ tunnel  │ functional-941011 tunnel --alsologtostderr                                                                                                                      │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 08:56 UTC │                     │
	│ service │ functional-941011 service list                                                                                                                                  │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 08:56 UTC │ 24 Nov 25 08:56 UTC │
	│ service │ functional-941011 service list -o json                                                                                                                          │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 08:56 UTC │ 24 Nov 25 08:56 UTC │
	│ service │ functional-941011 service --namespace=default --https --url hello-node                                                                                          │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 08:56 UTC │ 24 Nov 25 08:56 UTC │
	│ service │ functional-941011 service hello-node --url --format={{.IP}}                                                                                                     │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 08:56 UTC │ 24 Nov 25 08:56 UTC │
	│ service │ functional-941011 service hello-node --url                                                                                                                      │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 08:56 UTC │ 24 Nov 25 08:56 UTC │
	└─────────┴─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/11/24 08:55:17
	Running on machine: ip-172-31-30-239
	Binary: Built with gc go1.25.3 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1124 08:55:17.468234 1683796 out.go:360] Setting OutFile to fd 1 ...
	I1124 08:55:17.468335 1683796 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1124 08:55:17.468338 1683796 out.go:374] Setting ErrFile to fd 2...
	I1124 08:55:17.468343 1683796 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1124 08:55:17.468596 1683796 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21978-1652607/.minikube/bin
	I1124 08:55:17.468944 1683796 out.go:368] Setting JSON to false
	I1124 08:55:17.469918 1683796 start.go:133] hostinfo: {"hostname":"ip-172-31-30-239","uptime":27447,"bootTime":1763947071,"procs":184,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"92f46a7d-c249-4c12-924a-77f64874c910"}
	I1124 08:55:17.469982 1683796 start.go:143] virtualization:  
	I1124 08:55:17.473582 1683796 out.go:179] * [functional-941011] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1124 08:55:17.476672 1683796 out.go:179]   - MINIKUBE_LOCATION=21978
	I1124 08:55:17.476747 1683796 notify.go:221] Checking for updates...
	I1124 08:55:17.483440 1683796 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1124 08:55:17.486395 1683796 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/21978-1652607/kubeconfig
	I1124 08:55:17.489233 1683796 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/21978-1652607/.minikube
	I1124 08:55:17.492140 1683796 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1124 08:55:17.495020 1683796 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1124 08:55:17.498364 1683796 config.go:182] Loaded profile config "functional-941011": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
	I1124 08:55:17.498504 1683796 driver.go:422] Setting default libvirt URI to qemu:///system
	I1124 08:55:17.523705 1683796 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1124 08:55:17.523824 1683796 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1124 08:55:17.592647 1683796 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:2 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:40 OomKillDisable:true NGoroutines:65 SystemTime:2025-11-24 08:55:17.582430411 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214831104 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1124 08:55:17.592748 1683796 docker.go:319] overlay module found
	I1124 08:55:17.595887 1683796 out.go:179] * Using the docker driver based on existing profile
	I1124 08:55:17.598655 1683796 start.go:309] selected driver: docker
	I1124 08:55:17.598682 1683796 start.go:927] validating driver "docker" against &{Name:functional-941011 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.2 ClusterName:functional-941011 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.34.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1124 08:55:17.598780 1683796 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1124 08:55:17.599329 1683796 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1124 08:55:17.668506 1683796 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:2 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:40 OomKillDisable:true NGoroutines:65 SystemTime:2025-11-24 08:55:17.658593975 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214831104 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1124 08:55:17.668882 1683796 start_flags.go:992] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I1124 08:55:17.668920 1683796 cni.go:84] Creating CNI manager for ""
	I1124 08:55:17.668976 1683796 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1124 08:55:17.669035 1683796 start.go:353] cluster config:
	{Name:functional-941011 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.2 ClusterName:functional-941011 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.34.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1124 08:55:17.672240 1683796 out.go:179] * Starting "functional-941011" primary control-plane node in "functional-941011" cluster
	I1124 08:55:17.675055 1683796 cache.go:134] Beginning downloading kic base image for docker with containerd
	I1124 08:55:17.677904 1683796 out.go:179] * Pulling base image v0.0.48-1763789673-21948 ...
	I1124 08:55:17.680669 1683796 preload.go:188] Checking if preload exists for k8s version v1.34.2 and runtime containerd
	I1124 08:55:17.680706 1683796 preload.go:203] Found local preload: /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.34.2-containerd-overlay2-arm64.tar.lz4
	I1124 08:55:17.680714 1683796 cache.go:65] Caching tarball of preloaded images
	I1124 08:55:17.680813 1683796 preload.go:238] Found /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.34.2-containerd-overlay2-arm64.tar.lz4 in cache, skipping download
	I1124 08:55:17.680822 1683796 cache.go:68] Finished verifying existence of preloaded tar for v1.34.2 on containerd
	I1124 08:55:17.680932 1683796 profile.go:143] Saving config to /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-941011/config.json ...
	I1124 08:55:17.681166 1683796 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f in local docker daemon
	I1124 08:55:17.700322 1683796 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f in local docker daemon, skipping pull
	I1124 08:55:17.700333 1683796 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f exists in daemon, skipping load
	I1124 08:55:17.700355 1683796 cache.go:243] Successfully downloaded all kic artifacts
	I1124 08:55:17.700386 1683796 start.go:360] acquireMachinesLock for functional-941011: {Name:mk8b22cdd127df99565a1c745f9af2ae45583af3 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 08:55:17.700457 1683796 start.go:364] duration metric: took 48.108µs to acquireMachinesLock for "functional-941011"
	I1124 08:55:17.700476 1683796 start.go:96] Skipping create...Using existing machine configuration
	I1124 08:55:17.700480 1683796 fix.go:54] fixHost starting: 
	I1124 08:55:17.700749 1683796 cli_runner.go:164] Run: docker container inspect functional-941011 --format={{.State.Status}}
	I1124 08:55:17.717928 1683796 fix.go:112] recreateIfNeeded on functional-941011: state=Running err=<nil>
	W1124 08:55:17.717947 1683796 fix.go:138] unexpected machine state, will restart: <nil>
	I1124 08:55:17.721202 1683796 out.go:252] * Updating the running docker "functional-941011" container ...
	I1124 08:55:17.721242 1683796 machine.go:94] provisionDockerMachine start ...
	I1124 08:55:17.721322 1683796 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-941011
	I1124 08:55:17.738999 1683796 main.go:143] libmachine: Using SSH client type: native
	I1124 08:55:17.739339 1683796 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 34679 <nil> <nil>}
	I1124 08:55:17.739346 1683796 main.go:143] libmachine: About to run SSH command:
	hostname
	I1124 08:55:17.890021 1683796 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-941011
	
	I1124 08:55:17.890036 1683796 ubuntu.go:182] provisioning hostname "functional-941011"
	I1124 08:55:17.890105 1683796 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-941011
	I1124 08:55:17.909461 1683796 main.go:143] libmachine: Using SSH client type: native
	I1124 08:55:17.909769 1683796 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 34679 <nil> <nil>}
	I1124 08:55:17.909778 1683796 main.go:143] libmachine: About to run SSH command:
	sudo hostname functional-941011 && echo "functional-941011" | sudo tee /etc/hostname
	I1124 08:55:18.073329 1683796 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-941011
	
	I1124 08:55:18.073432 1683796 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-941011
	I1124 08:55:18.092923 1683796 main.go:143] libmachine: Using SSH client type: native
	I1124 08:55:18.093250 1683796 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 34679 <nil> <nil>}
	I1124 08:55:18.093264 1683796 main.go:143] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sfunctional-941011' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 functional-941011/g' /etc/hosts;
				else 
					echo '127.0.1.1 functional-941011' | sudo tee -a /etc/hosts; 
				fi
			fi
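	[editor's note: the /etc/hosts snippet above can be exercised standalone. A minimal sketch, assuming GNU sed on Linux; it works on a scratch file so no sudo or real /etc/hosts is touched, and the hostname is the one from this run:]

```shell
#!/bin/sh
# Scratch copy standing in for /etc/hosts.
hosts=$(mktemp)
printf '127.0.0.1 localhost\n127.0.1.1 old-name\n' > "$hosts"

name=functional-941011
# Same logic as the provisioner: if the hostname is absent, rewrite an
# existing 127.0.1.1 line, otherwise append a fresh one.
if ! grep -q "[[:space:]]$name\$" "$hosts"; then
	if grep -q '^127\.0\.1\.1[[:space:]]' "$hosts"; then
		sed -i "s/^127\.0\.1\.1[[:space:]].*/127.0.1.1 $name/" "$hosts"
	else
		echo "127.0.1.1 $name" >> "$hosts"
	fi
fi
grep '^127\.0\.1\.1' "$hosts"   # 127.0.1.1 functional-941011
rm -f "$hosts"
```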
	I1124 08:55:18.242760 1683796 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	I1124 08:55:18.242776 1683796 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/21978-1652607/.minikube CaCertPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/21978-1652607/.minikube}
	I1124 08:55:18.242794 1683796 ubuntu.go:190] setting up certificates
	I1124 08:55:18.242802 1683796 provision.go:84] configureAuth start
	I1124 08:55:18.242861 1683796 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-941011
	I1124 08:55:18.262769 1683796 provision.go:143] copyHostCerts
	I1124 08:55:18.262838 1683796 exec_runner.go:144] found /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.pem, removing ...
	I1124 08:55:18.262851 1683796 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.pem
	I1124 08:55:18.262915 1683796 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.pem (1078 bytes)
	I1124 08:55:18.263011 1683796 exec_runner.go:144] found /home/jenkins/minikube-integration/21978-1652607/.minikube/cert.pem, removing ...
	I1124 08:55:18.263015 1683796 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21978-1652607/.minikube/cert.pem
	I1124 08:55:18.263035 1683796 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/21978-1652607/.minikube/cert.pem (1123 bytes)
	I1124 08:55:18.263126 1683796 exec_runner.go:144] found /home/jenkins/minikube-integration/21978-1652607/.minikube/key.pem, removing ...
	I1124 08:55:18.263130 1683796 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21978-1652607/.minikube/key.pem
	I1124 08:55:18.263148 1683796 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/21978-1652607/.minikube/key.pem (1679 bytes)
	I1124 08:55:18.263207 1683796 provision.go:117] generating server cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca-key.pem org=jenkins.functional-941011 san=[127.0.0.1 192.168.49.2 functional-941011 localhost minikube]
	I1124 08:55:18.726475 1683796 provision.go:177] copyRemoteCerts
	I1124 08:55:18.726531 1683796 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1124 08:55:18.726570 1683796 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-941011
	I1124 08:55:18.745182 1683796 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34679 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-941011/id_rsa Username:docker}
	I1124 08:55:18.850504 1683796 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server.pem --> /etc/docker/server.pem (1220 bytes)
	I1124 08:55:18.868108 1683796 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I1124 08:55:18.887588 1683796 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I1124 08:55:18.905442 1683796 provision.go:87] duration metric: took 662.617921ms to configureAuth
	I1124 08:55:18.905460 1683796 ubuntu.go:206] setting minikube options for container-runtime
	I1124 08:55:18.905664 1683796 config.go:182] Loaded profile config "functional-941011": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
	I1124 08:55:18.905669 1683796 machine.go:97] duration metric: took 1.184422678s to provisionDockerMachine
	I1124 08:55:18.905676 1683796 start.go:293] postStartSetup for "functional-941011" (driver="docker")
	I1124 08:55:18.905684 1683796 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1124 08:55:18.905729 1683796 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1124 08:55:18.905771 1683796 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-941011
	I1124 08:55:18.923021 1683796 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34679 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-941011/id_rsa Username:docker}
	I1124 08:55:19.030906 1683796 ssh_runner.go:195] Run: cat /etc/os-release
	I1124 08:55:19.034640 1683796 main.go:143] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1124 08:55:19.034664 1683796 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1124 08:55:19.034675 1683796 filesync.go:126] Scanning /home/jenkins/minikube-integration/21978-1652607/.minikube/addons for local assets ...
	I1124 08:55:19.034728 1683796 filesync.go:126] Scanning /home/jenkins/minikube-integration/21978-1652607/.minikube/files for local assets ...
	I1124 08:55:19.034803 1683796 filesync.go:149] local asset: /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/ssl/certs/16544672.pem -> 16544672.pem in /etc/ssl/certs
	I1124 08:55:19.034899 1683796 filesync.go:149] local asset: /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/test/nested/copy/1654467/hosts -> hosts in /etc/test/nested/copy/1654467
	I1124 08:55:19.034943 1683796 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs /etc/test/nested/copy/1654467
	I1124 08:55:19.043522 1683796 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/ssl/certs/16544672.pem --> /etc/ssl/certs/16544672.pem (1708 bytes)
	I1124 08:55:19.061468 1683796 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/test/nested/copy/1654467/hosts --> /etc/test/nested/copy/1654467/hosts (40 bytes)
	I1124 08:55:19.080169 1683796 start.go:296] duration metric: took 174.478774ms for postStartSetup
	I1124 08:55:19.080250 1683796 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1124 08:55:19.080290 1683796 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-941011
	I1124 08:55:19.097004 1683796 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34679 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-941011/id_rsa Username:docker}
	I1124 08:55:19.200130 1683796 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1124 08:55:19.205226 1683796 fix.go:56] duration metric: took 1.504732262s for fixHost
	I1124 08:55:19.205241 1683796 start.go:83] releasing machines lock for "functional-941011", held for 1.504777006s
	I1124 08:55:19.205308 1683796 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-941011
	I1124 08:55:19.221901 1683796 ssh_runner.go:195] Run: cat /version.json
	I1124 08:55:19.221947 1683796 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-941011
	I1124 08:55:19.222202 1683796 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1124 08:55:19.222261 1683796 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-941011
	I1124 08:55:19.243229 1683796 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34679 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-941011/id_rsa Username:docker}
	I1124 08:55:19.256050 1683796 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34679 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-941011/id_rsa Username:docker}
	I1124 08:55:19.350647 1683796 ssh_runner.go:195] Run: systemctl --version
	I1124 08:55:19.440107 1683796 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W1124 08:55:19.444563 1683796 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1124 08:55:19.444641 1683796 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1124 08:55:19.453123 1683796 cni.go:259] no active bridge cni configs found in "/etc/cni/net.d" - nothing to disable
	I1124 08:55:19.453136 1683796 start.go:496] detecting cgroup driver to use...
	I1124 08:55:19.453172 1683796 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1124 08:55:19.453236 1683796 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I1124 08:55:19.468148 1683796 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I1124 08:55:19.481311 1683796 docker.go:218] disabling cri-docker service (if available) ...
	I1124 08:55:19.481390 1683796 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1124 08:55:19.497523 1683796 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1124 08:55:19.511535 1683796 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1124 08:55:19.663748 1683796 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1124 08:55:19.822234 1683796 docker.go:234] disabling docker service ...
	I1124 08:55:19.822289 1683796 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1124 08:55:19.837519 1683796 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1124 08:55:19.850225 1683796 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1124 08:55:19.997624 1683796 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1124 08:55:20.153044 1683796 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1124 08:55:20.165726 1683796 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I1124 08:55:20.180439 1683796 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.34.2/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.34.2/bin/linux/arm64/kubeadm.sha256
	I1124 08:55:20.335739 1683796 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10.1"|' /etc/containerd/config.toml"
	I1124 08:55:20.345397 1683796 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I1124 08:55:20.354286 1683796 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I1124 08:55:20.354344 1683796 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I1124 08:55:20.364059 1683796 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1124 08:55:20.373096 1683796 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I1124 08:55:20.382044 1683796 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1124 08:55:20.391690 1683796 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1124 08:55:20.400009 1683796 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I1124 08:55:20.408992 1683796 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I1124 08:55:20.418044 1683796 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I1124 08:55:20.427144 1683796 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1124 08:55:20.435729 1683796 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1124 08:55:20.443486 1683796 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1124 08:55:20.592467 1683796 ssh_runner.go:195] Run: sudo systemctl restart containerd
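	[editor's note: the sed sequence above rewrites /etc/containerd/config.toml before the restart. A minimal sketch of the cgroup-driver edit in isolation, assuming GNU sed and an illustrative config fragment:]

```shell
#!/bin/sh
# Scratch copy of a containerd config.toml fragment (illustrative content;
# the real file is /etc/containerd/config.toml inside the node).
cfg=$(mktemp)
cat > "$cfg" <<'EOF'
[plugins."io.containerd.grpc.v1.cri".containerd.runtimes.runc.options]
  SystemdCgroup = true
EOF

# Same edit the log performs: flip SystemdCgroup to false so containerd
# uses cgroupfs, matching the kubelet's cgroupDriver: cgroupfs.
sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' "$cfg"
grep SystemdCgroup "$cfg"   # SystemdCgroup = false (indentation preserved)
rm -f "$cfg"
```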
	I1124 08:55:20.925703 1683796 start.go:543] Will wait 60s for socket path /run/containerd/containerd.sock
	I1124 08:55:20.925775 1683796 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I1124 08:55:20.929932 1683796 start.go:564] Will wait 60s for crictl version
	I1124 08:55:20.929990 1683796 ssh_runner.go:195] Run: which crictl
	I1124 08:55:20.934038 1683796 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1124 08:55:20.970075 1683796 start.go:580] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v2.1.5
	RuntimeApiVersion:  v1
	I1124 08:55:20.970162 1683796 ssh_runner.go:195] Run: containerd --version
	I1124 08:55:20.990325 1683796 ssh_runner.go:195] Run: containerd --version
	I1124 08:55:21.019054 1683796 out.go:179] * Preparing Kubernetes v1.34.2 on containerd 2.1.5 ...
	I1124 08:55:21.022022 1683796 cli_runner.go:164] Run: docker network inspect functional-941011 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1124 08:55:21.039118 1683796 ssh_runner.go:195] Run: grep 192.168.49.1	host.minikube.internal$ /etc/hosts
	I1124 08:55:21.046724 1683796 out.go:179]   - apiserver.enable-admission-plugins=NamespaceAutoProvision
	I1124 08:55:21.049537 1683796 kubeadm.go:884] updating cluster {Name:functional-941011 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.2 ClusterName:functional-941011 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.34.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1124 08:55:21.049813 1683796 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.34.2/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.34.2/bin/linux/arm64/kubeadm.sha256
	I1124 08:55:21.205009 1683796 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.34.2/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.34.2/bin/linux/arm64/kubeadm.sha256
	I1124 08:55:21.366747 1683796 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.34.2/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.34.2/bin/linux/arm64/kubeadm.sha256
	I1124 08:55:21.518037 1683796 preload.go:188] Checking if preload exists for k8s version v1.34.2 and runtime containerd
	I1124 08:55:21.518221 1683796 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.34.2/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.34.2/bin/linux/arm64/kubeadm.sha256
	I1124 08:55:21.693357 1683796 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.34.2/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.34.2/bin/linux/arm64/kubeadm.sha256
	I1124 08:55:21.845631 1683796 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.34.2/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.34.2/bin/linux/arm64/kubeadm.sha256
	I1124 08:55:22.020654 1683796 ssh_runner.go:195] Run: sudo crictl images --output json
	I1124 08:55:22.056751 1683796 containerd.go:627] all images are preloaded for containerd runtime.
	I1124 08:55:22.056764 1683796 containerd.go:534] Images already preloaded, skipping extraction
	I1124 08:55:22.056825 1683796 ssh_runner.go:195] Run: sudo crictl images --output json
	I1124 08:55:22.081759 1683796 containerd.go:627] all images are preloaded for containerd runtime.
	I1124 08:55:22.081771 1683796 cache_images.go:86] Images are preloaded, skipping loading
	I1124 08:55:22.081777 1683796 kubeadm.go:935] updating node { 192.168.49.2 8441 v1.34.2 containerd true true} ...
	I1124 08:55:22.081898 1683796 kubeadm.go:947] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.34.2/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=functional-941011 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.49.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.34.2 ClusterName:functional-941011 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I1124 08:55:22.081965 1683796 ssh_runner.go:195] Run: sudo crictl info
	I1124 08:55:22.112338 1683796 extraconfig.go:125] Overwriting default enable-admission-plugins=NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota with user provided enable-admission-plugins=NamespaceAutoProvision for component apiserver
	I1124 08:55:22.112356 1683796 cni.go:84] Creating CNI manager for ""
	I1124 08:55:22.112365 1683796 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1124 08:55:22.112373 1683796 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1124 08:55:22.112395 1683796 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.49.2 APIServerPort:8441 KubernetesVersion:v1.34.2 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:functional-941011 NodeName:functional-941011 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceAutoProvision] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.49.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.49.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1124 08:55:22.112508 1683796 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.49.2
	  bindPort: 8441
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "functional-941011"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.49.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceAutoProvision"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8441
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.34.2
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I1124 08:55:22.112572 1683796 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.34.2
	I1124 08:55:22.120785 1683796 binaries.go:51] Found k8s binaries, skipping transfer
	I1124 08:55:22.120849 1683796 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1124 08:55:22.128697 1683796 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (321 bytes)
	I1124 08:55:22.142331 1683796 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I1124 08:55:22.155492 1683796 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2080 bytes)
	I1124 08:55:22.168924 1683796 ssh_runner.go:195] Run: grep 192.168.49.2	control-plane.minikube.internal$ /etc/hosts
	I1124 08:55:22.172727 1683796 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1124 08:55:22.309085 1683796 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1124 08:55:22.323875 1683796 certs.go:69] Setting up /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-941011 for IP: 192.168.49.2
	I1124 08:55:22.323885 1683796 certs.go:195] generating shared ca certs ...
	I1124 08:55:22.323899 1683796 certs.go:227] acquiring lock for ca certs: {Name:mkbe540a30c4376a351176f7fe6fec044d058b09 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1124 08:55:22.324039 1683796 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.key
	I1124 08:55:22.324084 1683796 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/proxy-client-ca.key
	I1124 08:55:22.324091 1683796 certs.go:257] generating profile certs ...
	I1124 08:55:22.324182 1683796 certs.go:360] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-941011/client.key
	I1124 08:55:22.324230 1683796 certs.go:360] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-941011/apiserver.key.ba079cdf
	I1124 08:55:22.324268 1683796 certs.go:360] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-941011/proxy-client.key
	I1124 08:55:22.324376 1683796 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/1654467.pem (1338 bytes)
	W1124 08:55:22.324404 1683796 certs.go:480] ignoring /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/1654467_empty.pem, impossibly tiny 0 bytes
	I1124 08:55:22.324411 1683796 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca-key.pem (1671 bytes)
	I1124 08:55:22.324436 1683796 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem (1078 bytes)
	I1124 08:55:22.324461 1683796 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/cert.pem (1123 bytes)
	I1124 08:55:22.324483 1683796 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/key.pem (1679 bytes)
	I1124 08:55:22.324524 1683796 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/ssl/certs/16544672.pem (1708 bytes)
	I1124 08:55:22.325136 1683796 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1124 08:55:22.351301 1683796 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I1124 08:55:22.376977 1683796 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1124 08:55:22.398430 1683796 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I1124 08:55:22.418593 1683796 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-941011/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I1124 08:55:22.437375 1683796 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-941011/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I1124 08:55:22.457764 1683796 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-941011/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1124 08:55:22.477819 1683796 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-941011/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I1124 08:55:22.495804 1683796 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/1654467.pem --> /usr/share/ca-certificates/1654467.pem (1338 bytes)
	I1124 08:55:22.514573 1683796 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/ssl/certs/16544672.pem --> /usr/share/ca-certificates/16544672.pem (1708 bytes)
	I1124 08:55:22.532398 1683796 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1124 08:55:22.551242 1683796 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I1124 08:55:22.564976 1683796 ssh_runner.go:195] Run: openssl version
	I1124 08:55:22.571897 1683796 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1654467.pem && ln -fs /usr/share/ca-certificates/1654467.pem /etc/ssl/certs/1654467.pem"
	I1124 08:55:22.583269 1683796 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1654467.pem
	I1124 08:55:22.587414 1683796 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Nov 24 08:53 /usr/share/ca-certificates/1654467.pem
	I1124 08:55:22.587470 1683796 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1654467.pem
	I1124 08:55:22.628864 1683796 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1654467.pem /etc/ssl/certs/51391683.0"
	I1124 08:55:22.637997 1683796 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/16544672.pem && ln -fs /usr/share/ca-certificates/16544672.pem /etc/ssl/certs/16544672.pem"
	I1124 08:55:22.647140 1683796 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/16544672.pem
	I1124 08:55:22.651271 1683796 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Nov 24 08:53 /usr/share/ca-certificates/16544672.pem
	I1124 08:55:22.651331 1683796 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/16544672.pem
	I1124 08:55:22.694011 1683796 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/16544672.pem /etc/ssl/certs/3ec20f2e.0"
	I1124 08:55:22.702228 1683796 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I1124 08:55:22.711092 1683796 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1124 08:55:22.714950 1683796 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Nov 24 08:44 /usr/share/ca-certificates/minikubeCA.pem
	I1124 08:55:22.715012 1683796 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1124 08:55:22.757131 1683796 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
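The three `ln -fs` steps above follow OpenSSL's subject-hash convention: trust-store lookups expect each CA cert to be reachable as `<subject_hash>.0` under `/etc/ssl/certs`, where the hash comes from `openssl x509 -hash`. A minimal sketch of that convention, assuming the `openssl` CLI is on PATH (the demo file names are illustrative, not from this run):

```shell
# Generate a throwaway self-signed CA cert (illustrative names only).
openssl req -x509 -newkey rsa:2048 -nodes -days 1 \
  -subj "/CN=minikubeCA-demo" -keyout demo-ca.key -out demo-ca.pem

# Compute the subject hash OpenSSL uses for trust-store lookups.
hash=$(openssl x509 -hash -noout -in demo-ca.pem)

# Link the cert under "<hash>.0", mirroring the logged "ln -fs ..." step.
ln -fs "$(pwd)/demo-ca.pem" "${hash}.0"
```

The `b5213941.0` name in the log is exactly this: the subject hash of minikubeCA.pem plus the `.0` collision-counter suffix.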
	I1124 08:55:22.765138 1683796 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1124 08:55:22.769200 1683796 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I1124 08:55:22.810951 1683796 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I1124 08:55:22.853152 1683796 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I1124 08:55:22.895132 1683796 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I1124 08:55:22.936521 1683796 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I1124 08:55:22.978772 1683796 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
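Each `-checkend 86400` run above asks one question: will this certificate still be valid 86400 seconds (24 hours) from now? Exit status 0 means yes, so minikube can skip regeneration. A hedged sketch with a throwaway cert (names are illustrative; the real run checks the certs under `/var/lib/minikube/certs`):

```shell
# Create a short-lived self-signed cert valid for 2 days.
openssl req -x509 -newkey rsa:2048 -nodes -days 2 \
  -subj "/CN=checkend-demo" -keyout demo.key -out demo.crt

# Exit 0: the cert will NOT expire within the next 86400 seconds.
openssl x509 -noout -in demo.crt -checkend 86400 && echo "valid past 24h"
```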
	I1124 08:55:23.020028 1683796 kubeadm.go:401] StartCluster: {Name:functional-941011 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.2 ClusterName:functional-941011 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APISer
verNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.34.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0
MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1124 08:55:23.020113 1683796 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I1124 08:55:23.020192 1683796 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1124 08:55:23.049886 1683796 cri.go:89] found id: "c59aa8a0dc911d866d1284907405697c4c3891d4b4c4423e9f764fbf37efa9c9"
	I1124 08:55:23.049897 1683796 cri.go:89] found id: "05393cf0d0a6a4762c03e908fda17fafda4ed38bac67cdf553df9130af52fa37"
	I1124 08:55:23.049900 1683796 cri.go:89] found id: "ff28a2e89f65b3323fbc0f41262a0b445c865c0474fbdac4ec03f66b94a6826a"
	I1124 08:55:23.049903 1683796 cri.go:89] found id: "62155860688a446bc4f0edf281779e28f4ecf113640dbf9995722ce6bfe4cd55"
	I1124 08:55:23.049905 1683796 cri.go:89] found id: "39dbd124ad9131afd2e38d1cc6019bc4e02106f43653bf39b293c3e257811124"
	I1124 08:55:23.049908 1683796 cri.go:89] found id: "784ae9c9b61292b5cfcfba35d59469a6a91365290aa46304545499084ede8d7e"
	I1124 08:55:23.049910 1683796 cri.go:89] found id: "8a2c1ed065d22c98f59a6c1c4a6816d5635a1993d4b79170db5bdf930e6a5464"
	I1124 08:55:23.049912 1683796 cri.go:89] found id: "c67aa399f4b3099636ae5de0029e6f8778af29edd2cb80d538b3fce9f2752591"
	I1124 08:55:23.049915 1683796 cri.go:89] found id: ""
	I1124 08:55:23.049968 1683796 ssh_runner.go:195] Run: sudo runc --root /run/containerd/runc/k8s.io list -f json
	I1124 08:55:23.079726 1683796 cri.go:116] JSON = [{"ociVersion":"1.2.1","id":"05393cf0d0a6a4762c03e908fda17fafda4ed38bac67cdf553df9130af52fa37","pid":2099,"status":"running","bundle":"/run/containerd/io.containerd.runtime.v2.task/k8s.io/05393cf0d0a6a4762c03e908fda17fafda4ed38bac67cdf553df9130af52fa37","rootfs":"/run/containerd/io.containerd.runtime.v2.task/k8s.io/05393cf0d0a6a4762c03e908fda17fafda4ed38bac67cdf553df9130af52fa37/rootfs","created":"2025-11-24T08:54:58.675058599Z","annotations":{"io.kubernetes.cri.container-name":"storage-provisioner","io.kubernetes.cri.container-type":"container","io.kubernetes.cri.image-name":"gcr.io/k8s-minikube/storage-provisioner:v5","io.kubernetes.cri.sandbox-id":"1d6d04a32577cb2690c03ea4c47a4ae8d00628d8d6eda57c874304ed215955af","io.kubernetes.cri.sandbox-name":"storage-provisioner","io.kubernetes.cri.sandbox-namespace":"kube-system","io.kubernetes.cri.sandbox-uid":"13c14b20-a1ad-4212-98a5-325e1115d97a"},"owner":"root"},{"ociVersion":"1.2.1","id":"0fac3868adf86c6cd
5249817334e03f5da30b8e86da755a73d9399be4b347665","pid":1296,"status":"running","bundle":"/run/containerd/io.containerd.runtime.v2.task/k8s.io/0fac3868adf86c6cd5249817334e03f5da30b8e86da755a73d9399be4b347665","rootfs":"/run/containerd/io.containerd.runtime.v2.task/k8s.io/0fac3868adf86c6cd5249817334e03f5da30b8e86da755a73d9399be4b347665/rootfs","created":"2025-11-24T08:54:05.156880254Z","annotations":{"io.kubernetes.cri.container-type":"sandbox","io.kubernetes.cri.podsandbox.image-name":"registry.k8s.io/pause:3.10.1","io.kubernetes.cri.sandbox-cpu-period":"100000","io.kubernetes.cri.sandbox-cpu-quota":"0","io.kubernetes.cri.sandbox-cpu-shares":"102","io.kubernetes.cri.sandbox-id":"0fac3868adf86c6cd5249817334e03f5da30b8e86da755a73d9399be4b347665","io.kubernetes.cri.sandbox-log-directory":"/var/log/pods/kube-system_kube-scheduler-functional-941011_d818469beb8f10d11195f4baedc5ef8c","io.kubernetes.cri.sandbox-memory":"0","io.kubernetes.cri.sandbox-name":"kube-scheduler-functional-941011","io.kubernetes.cri.sandbox-n
amespace":"kube-system","io.kubernetes.cri.sandbox-uid":"d818469beb8f10d11195f4baedc5ef8c"},"owner":"root"},{"ociVersion":"1.2.1","id":"1d6d04a32577cb2690c03ea4c47a4ae8d00628d8d6eda57c874304ed215955af","pid":2022,"status":"running","bundle":"/run/containerd/io.containerd.runtime.v2.task/k8s.io/1d6d04a32577cb2690c03ea4c47a4ae8d00628d8d6eda57c874304ed215955af","rootfs":"/run/containerd/io.containerd.runtime.v2.task/k8s.io/1d6d04a32577cb2690c03ea4c47a4ae8d00628d8d6eda57c874304ed215955af/rootfs","created":"2025-11-24T08:54:58.532026681Z","annotations":{"io.kubernetes.cri.container-type":"sandbox","io.kubernetes.cri.podsandbox.image-name":"registry.k8s.io/pause:3.10.1","io.kubernetes.cri.sandbox-cpu-period":"100000","io.kubernetes.cri.sandbox-cpu-quota":"0","io.kubernetes.cri.sandbox-cpu-shares":"2","io.kubernetes.cri.sandbox-id":"1d6d04a32577cb2690c03ea4c47a4ae8d00628d8d6eda57c874304ed215955af","io.kubernetes.cri.sandbox-log-directory":"/var/log/pods/kube-system_storage-provisioner_13c14b20-a1ad-4212-98a5-325e111
5d97a","io.kubernetes.cri.sandbox-memory":"0","io.kubernetes.cri.sandbox-name":"storage-provisioner","io.kubernetes.cri.sandbox-namespace":"kube-system","io.kubernetes.cri.sandbox-uid":"13c14b20-a1ad-4212-98a5-325e1115d97a"},"owner":"root"},{"ociVersion":"1.2.1","id":"31e96b363c821d8fc29dc1cb352bf3573f9460d013794299fc7447452c6b41ef","pid":1655,"status":"running","bundle":"/run/containerd/io.containerd.runtime.v2.task/k8s.io/31e96b363c821d8fc29dc1cb352bf3573f9460d013794299fc7447452c6b41ef","rootfs":"/run/containerd/io.containerd.runtime.v2.task/k8s.io/31e96b363c821d8fc29dc1cb352bf3573f9460d013794299fc7447452c6b41ef/rootfs","created":"2025-11-24T08:54:17.199732112Z","annotations":{"io.kubernetes.cri.container-type":"sandbox","io.kubernetes.cri.podsandbox.image-name":"registry.k8s.io/pause:3.10.1","io.kubernetes.cri.sandbox-cpu-period":"100000","io.kubernetes.cri.sandbox-cpu-quota":"0","io.kubernetes.cri.sandbox-cpu-shares":"2","io.kubernetes.cri.sandbox-id":"31e96b363c821d8fc29dc1cb352bf3573f9460d013794299fc744
7452c6b41ef","io.kubernetes.cri.sandbox-log-directory":"/var/log/pods/kube-system_kube-proxy-kmdsq_d98705fa-0c92-4b81-a03b-02c36c100f80","io.kubernetes.cri.sandbox-memory":"0","io.kubernetes.cri.sandbox-name":"kube-proxy-kmdsq","io.kubernetes.cri.sandbox-namespace":"kube-system","io.kubernetes.cri.sandbox-uid":"d98705fa-0c92-4b81-a03b-02c36c100f80"},"owner":"root"},{"ociVersion":"1.2.1","id":"39dbd124ad9131afd2e38d1cc6019bc4e02106f43653bf39b293c3e257811124","pid":1416,"status":"running","bundle":"/run/containerd/io.containerd.runtime.v2.task/k8s.io/39dbd124ad9131afd2e38d1cc6019bc4e02106f43653bf39b293c3e257811124","rootfs":"/run/containerd/io.containerd.runtime.v2.task/k8s.io/39dbd124ad9131afd2e38d1cc6019bc4e02106f43653bf39b293c3e257811124/rootfs","created":"2025-11-24T08:54:05.369725182Z","annotations":{"io.kubernetes.cri.container-name":"kube-scheduler","io.kubernetes.cri.container-type":"container","io.kubernetes.cri.image-name":"registry.k8s.io/kube-scheduler:v1.34.2","io.kubernetes.cri.sandbox-id":"0fac38
68adf86c6cd5249817334e03f5da30b8e86da755a73d9399be4b347665","io.kubernetes.cri.sandbox-name":"kube-scheduler-functional-941011","io.kubernetes.cri.sandbox-namespace":"kube-system","io.kubernetes.cri.sandbox-uid":"d818469beb8f10d11195f4baedc5ef8c"},"owner":"root"},{"ociVersion":"1.2.1","id":"5f4eb46413316f2419943b36f8887735d96c24e947e5f24a833b39cb7126a31b","pid":1247,"status":"running","bundle":"/run/containerd/io.containerd.runtime.v2.task/k8s.io/5f4eb46413316f2419943b36f8887735d96c24e947e5f24a833b39cb7126a31b","rootfs":"/run/containerd/io.containerd.runtime.v2.task/k8s.io/5f4eb46413316f2419943b36f8887735d96c24e947e5f24a833b39cb7126a31b/rootfs","created":"2025-11-24T08:54:05.057180196Z","annotations":{"io.kubernetes.cri.container-type":"sandbox","io.kubernetes.cri.podsandbox.image-name":"registry.k8s.io/pause:3.10.1","io.kubernetes.cri.sandbox-cpu-period":"100000","io.kubernetes.cri.sandbox-cpu-quota":"0","io.kubernetes.cri.sandbox-cpu-shares":"204","io.kubernetes.cri.sandbox-id":"5f4eb46413316f2419943b36f888
7735d96c24e947e5f24a833b39cb7126a31b","io.kubernetes.cri.sandbox-log-directory":"/var/log/pods/kube-system_kube-controller-manager-functional-941011_f45edc56ceb2edbea8d62131f61a27c5","io.kubernetes.cri.sandbox-memory":"0","io.kubernetes.cri.sandbox-name":"kube-controller-manager-functional-941011","io.kubernetes.cri.sandbox-namespace":"kube-system","io.kubernetes.cri.sandbox-uid":"f45edc56ceb2edbea8d62131f61a27c5"},"owner":"root"},{"ociVersion":"1.2.1","id":"62155860688a446bc4f0edf281779e28f4ecf113640dbf9995722ce6bfe4cd55","pid":1733,"status":"running","bundle":"/run/containerd/io.containerd.runtime.v2.task/k8s.io/62155860688a446bc4f0edf281779e28f4ecf113640dbf9995722ce6bfe4cd55","rootfs":"/run/containerd/io.containerd.runtime.v2.task/k8s.io/62155860688a446bc4f0edf281779e28f4ecf113640dbf9995722ce6bfe4cd55/rootfs","created":"2025-11-24T08:54:17.45638532Z","annotations":{"io.kubernetes.cri.container-name":"kube-proxy","io.kubernetes.cri.container-type":"container","io.kubernetes.cri.image-name":"registry.k8s.io/
kube-proxy:v1.34.2","io.kubernetes.cri.sandbox-id":"31e96b363c821d8fc29dc1cb352bf3573f9460d013794299fc7447452c6b41ef","io.kubernetes.cri.sandbox-name":"kube-proxy-kmdsq","io.kubernetes.cri.sandbox-namespace":"kube-system","io.kubernetes.cri.sandbox-uid":"d98705fa-0c92-4b81-a03b-02c36c100f80"},"owner":"root"},{"ociVersion":"1.2.1","id":"63a1adeff106f91c197028326fa28f1d3fb7e75d4ec730fbd43845e2859206fe","pid":1673,"status":"running","bundle":"/run/containerd/io.containerd.runtime.v2.task/k8s.io/63a1adeff106f91c197028326fa28f1d3fb7e75d4ec730fbd43845e2859206fe","rootfs":"/run/containerd/io.containerd.runtime.v2.task/k8s.io/63a1adeff106f91c197028326fa28f1d3fb7e75d4ec730fbd43845e2859206fe/rootfs","created":"2025-11-24T08:54:17.314927923Z","annotations":{"io.kubernetes.cri.container-type":"sandbox","io.kubernetes.cri.podsandbox.image-name":"registry.k8s.io/pause:3.10.1","io.kubernetes.cri.sandbox-cpu-period":"100000","io.kubernetes.cri.sandbox-cpu-quota":"10000","io.kubernetes.cri.sandbox-cpu-shares":"102","io.kubern
etes.cri.sandbox-id":"63a1adeff106f91c197028326fa28f1d3fb7e75d4ec730fbd43845e2859206fe","io.kubernetes.cri.sandbox-log-directory":"/var/log/pods/kube-system_kindnet-vsrrw_3dccbc03-f381-4e6a-9c42-e221c48dc82b","io.kubernetes.cri.sandbox-memory":"52428800","io.kubernetes.cri.sandbox-name":"kindnet-vsrrw","io.kubernetes.cri.sandbox-namespace":"kube-system","io.kubernetes.cri.sandbox-uid":"3dccbc03-f381-4e6a-9c42-e221c48dc82b"},"owner":"root"},{"ociVersion":"1.2.1","id":"784ae9c9b61292b5cfcfba35d59469a6a91365290aa46304545499084ede8d7e","pid":1398,"status":"running","bundle":"/run/containerd/io.containerd.runtime.v2.task/k8s.io/784ae9c9b61292b5cfcfba35d59469a6a91365290aa46304545499084ede8d7e","rootfs":"/run/containerd/io.containerd.runtime.v2.task/k8s.io/784ae9c9b61292b5cfcfba35d59469a6a91365290aa46304545499084ede8d7e/rootfs","created":"2025-11-24T08:54:05.370590497Z","annotations":{"io.kubernetes.cri.container-name":"kube-apiserver","io.kubernetes.cri.container-type":"container","io.kubernetes.cri.image-name":"re
gistry.k8s.io/kube-apiserver:v1.34.2","io.kubernetes.cri.sandbox-id":"bd056617869f45bc86f625d12c70a5bb57a80728d460178c634e8500f33caa6c","io.kubernetes.cri.sandbox-name":"kube-apiserver-functional-941011","io.kubernetes.cri.sandbox-namespace":"kube-system","io.kubernetes.cri.sandbox-uid":"5373c2d87e202ee8fb227001fb54cf80"},"owner":"root"},{"ociVersion":"1.2.1","id":"8a2c1ed065d22c98f59a6c1c4a6816d5635a1993d4b79170db5bdf930e6a5464","pid":1363,"status":"running","bundle":"/run/containerd/io.containerd.runtime.v2.task/k8s.io/8a2c1ed065d22c98f59a6c1c4a6816d5635a1993d4b79170db5bdf930e6a5464","rootfs":"/run/containerd/io.containerd.runtime.v2.task/k8s.io/8a2c1ed065d22c98f59a6c1c4a6816d5635a1993d4b79170db5bdf930e6a5464/rootfs","created":"2025-11-24T08:54:05.290219846Z","annotations":{"io.kubernetes.cri.container-name":"kube-controller-manager","io.kubernetes.cri.container-type":"container","io.kubernetes.cri.image-name":"registry.k8s.io/kube-controller-manager:v1.34.2","io.kubernetes.cri.sandbox-id":"5f4eb46413316f24
19943b36f8887735d96c24e947e5f24a833b39cb7126a31b","io.kubernetes.cri.sandbox-name":"kube-controller-manager-functional-941011","io.kubernetes.cri.sandbox-namespace":"kube-system","io.kubernetes.cri.sandbox-uid":"f45edc56ceb2edbea8d62131f61a27c5"},"owner":"root"},{"ociVersion":"1.2.1","id":"abc7abcd5ce10cbacf352ea774b148d53af698f0d199f0022976bdac08da8136","pid":1198,"status":"running","bundle":"/run/containerd/io.containerd.runtime.v2.task/k8s.io/abc7abcd5ce10cbacf352ea774b148d53af698f0d199f0022976bdac08da8136","rootfs":"/run/containerd/io.containerd.runtime.v2.task/k8s.io/abc7abcd5ce10cbacf352ea774b148d53af698f0d199f0022976bdac08da8136/rootfs","created":"2025-11-24T08:54:05.000632619Z","annotations":{"io.kubernetes.cri.container-type":"sandbox","io.kubernetes.cri.podsandbox.image-name":"registry.k8s.io/pause:3.10.1","io.kubernetes.cri.sandbox-cpu-period":"100000","io.kubernetes.cri.sandbox-cpu-quota":"0","io.kubernetes.cri.sandbox-cpu-shares":"102","io.kubernetes.cri.sandbox-id":"abc7abcd5ce10cbacf352ea774b14
8d53af698f0d199f0022976bdac08da8136","io.kubernetes.cri.sandbox-log-directory":"/var/log/pods/kube-system_etcd-functional-941011_96beab94a517952a4ee430c6e2156e6f","io.kubernetes.cri.sandbox-memory":"0","io.kubernetes.cri.sandbox-name":"etcd-functional-941011","io.kubernetes.cri.sandbox-namespace":"kube-system","io.kubernetes.cri.sandbox-uid":"96beab94a517952a4ee430c6e2156e6f"},"owner":"root"},{"ociVersion":"1.2.1","id":"b1743c0f4c0960bc5012066e296265346c45ac42bb95c03d009712577c3c8885","pid":2080,"status":"running","bundle":"/run/containerd/io.containerd.runtime.v2.task/k8s.io/b1743c0f4c0960bc5012066e296265346c45ac42bb95c03d009712577c3c8885","rootfs":"/run/containerd/io.containerd.runtime.v2.task/k8s.io/b1743c0f4c0960bc5012066e296265346c45ac42bb95c03d009712577c3c8885/rootfs","created":"2025-11-24T08:54:58.621416595Z","annotations":{"io.kubernetes.cri.container-type":"sandbox","io.kubernetes.cri.podsandbox.image-name":"registry.k8s.io/pause:3.10.1","io.kubernetes.cri.sandbox-cpu-period":"100000","io.kubernetes.
cri.sandbox-cpu-quota":"0","io.kubernetes.cri.sandbox-cpu-shares":"102","io.kubernetes.cri.sandbox-id":"b1743c0f4c0960bc5012066e296265346c45ac42bb95c03d009712577c3c8885","io.kubernetes.cri.sandbox-log-directory":"/var/log/pods/kube-system_coredns-66bc5c9577-slkfz_4ef50cca-2036-4f7d-b4e9-c814be440acd","io.kubernetes.cri.sandbox-memory":"178257920","io.kubernetes.cri.sandbox-name":"coredns-66bc5c9577-slkfz","io.kubernetes.cri.sandbox-namespace":"kube-system","io.kubernetes.cri.sandbox-uid":"4ef50cca-2036-4f7d-b4e9-c814be440acd"},"owner":"root"},{"ociVersion":"1.2.1","id":"bd056617869f45bc86f625d12c70a5bb57a80728d460178c634e8500f33caa6c","pid":1276,"status":"running","bundle":"/run/containerd/io.containerd.runtime.v2.task/k8s.io/bd056617869f45bc86f625d12c70a5bb57a80728d460178c634e8500f33caa6c","rootfs":"/run/containerd/io.containerd.runtime.v2.task/k8s.io/bd056617869f45bc86f625d12c70a5bb57a80728d460178c634e8500f33caa6c/rootfs","created":"2025-11-24T08:54:05.099357625Z","annotations":{"io.kubernetes.cri.container
-type":"sandbox","io.kubernetes.cri.podsandbox.image-name":"registry.k8s.io/pause:3.10.1","io.kubernetes.cri.sandbox-cpu-period":"100000","io.kubernetes.cri.sandbox-cpu-quota":"0","io.kubernetes.cri.sandbox-cpu-shares":"256","io.kubernetes.cri.sandbox-id":"bd056617869f45bc86f625d12c70a5bb57a80728d460178c634e8500f33caa6c","io.kubernetes.cri.sandbox-log-directory":"/var/log/pods/kube-system_kube-apiserver-functional-941011_5373c2d87e202ee8fb227001fb54cf80","io.kubernetes.cri.sandbox-memory":"0","io.kubernetes.cri.sandbox-name":"kube-apiserver-functional-941011","io.kubernetes.cri.sandbox-namespace":"kube-system","io.kubernetes.cri.sandbox-uid":"5373c2d87e202ee8fb227001fb54cf80"},"owner":"root"},{"ociVersion":"1.2.1","id":"c59aa8a0dc911d866d1284907405697c4c3891d4b4c4423e9f764fbf37efa9c9","pid":2137,"status":"running","bundle":"/run/containerd/io.containerd.runtime.v2.task/k8s.io/c59aa8a0dc911d866d1284907405697c4c3891d4b4c4423e9f764fbf37efa9c9","rootfs":"/run/containerd/io.containerd.runtime.v2.task/k8s.io/c59aa8
a0dc911d866d1284907405697c4c3891d4b4c4423e9f764fbf37efa9c9/rootfs","created":"2025-11-24T08:54:58.806154617Z","annotations":{"io.kubernetes.cri.container-name":"coredns","io.kubernetes.cri.container-type":"container","io.kubernetes.cri.image-name":"registry.k8s.io/coredns/coredns:v1.12.1","io.kubernetes.cri.sandbox-id":"b1743c0f4c0960bc5012066e296265346c45ac42bb95c03d009712577c3c8885","io.kubernetes.cri.sandbox-name":"coredns-66bc5c9577-slkfz","io.kubernetes.cri.sandbox-namespace":"kube-system","io.kubernetes.cri.sandbox-uid":"4ef50cca-2036-4f7d-b4e9-c814be440acd"},"owner":"root"},{"ociVersion":"1.2.1","id":"c67aa399f4b3099636ae5de0029e6f8778af29edd2cb80d538b3fce9f2752591","pid":1323,"status":"running","bundle":"/run/containerd/io.containerd.runtime.v2.task/k8s.io/c67aa399f4b3099636ae5de0029e6f8778af29edd2cb80d538b3fce9f2752591","rootfs":"/run/containerd/io.containerd.runtime.v2.task/k8s.io/c67aa399f4b3099636ae5de0029e6f8778af29edd2cb80d538b3fce9f2752591/rootfs","created":"2025-11-24T08:54:05.189044791Z","ann
otations":{"io.kubernetes.cri.container-name":"etcd","io.kubernetes.cri.container-type":"container","io.kubernetes.cri.image-name":"registry.k8s.io/etcd:3.6.5-0","io.kubernetes.cri.sandbox-id":"abc7abcd5ce10cbacf352ea774b148d53af698f0d199f0022976bdac08da8136","io.kubernetes.cri.sandbox-name":"etcd-functional-941011","io.kubernetes.cri.sandbox-namespace":"kube-system","io.kubernetes.cri.sandbox-uid":"96beab94a517952a4ee430c6e2156e6f"},"owner":"root"},{"ociVersion":"1.2.1","id":"ff28a2e89f65b3323fbc0f41262a0b445c865c0474fbdac4ec03f66b94a6826a","pid":1779,"status":"running","bundle":"/run/containerd/io.containerd.runtime.v2.task/k8s.io/ff28a2e89f65b3323fbc0f41262a0b445c865c0474fbdac4ec03f66b94a6826a","rootfs":"/run/containerd/io.containerd.runtime.v2.task/k8s.io/ff28a2e89f65b3323fbc0f41262a0b445c865c0474fbdac4ec03f66b94a6826a/rootfs","created":"2025-11-24T08:54:17.644922941Z","annotations":{"io.kubernetes.cri.container-name":"kindnet-cni","io.kubernetes.cri.container-type":"container","io.kubernetes.cri.image-na
me":"docker.io/kindest/kindnetd:v20250512-df8de77b","io.kubernetes.cri.sandbox-id":"63a1adeff106f91c197028326fa28f1d3fb7e75d4ec730fbd43845e2859206fe","io.kubernetes.cri.sandbox-name":"kindnet-vsrrw","io.kubernetes.cri.sandbox-namespace":"kube-system","io.kubernetes.cri.sandbox-uid":"3dccbc03-f381-4e6a-9c42-e221c48dc82b"},"owner":"root"}]
	I1124 08:55:23.080024 1683796 cri.go:126] list returned 16 containers
	I1124 08:55:23.080032 1683796 cri.go:129] container: {ID:05393cf0d0a6a4762c03e908fda17fafda4ed38bac67cdf553df9130af52fa37 Status:running}
	I1124 08:55:23.080051 1683796 cri.go:135] skipping {05393cf0d0a6a4762c03e908fda17fafda4ed38bac67cdf553df9130af52fa37 running}: state = "running", want "paused"
	I1124 08:55:23.080058 1683796 cri.go:129] container: {ID:0fac3868adf86c6cd5249817334e03f5da30b8e86da755a73d9399be4b347665 Status:running}
	I1124 08:55:23.080063 1683796 cri.go:131] skipping 0fac3868adf86c6cd5249817334e03f5da30b8e86da755a73d9399be4b347665 - not in ps
	I1124 08:55:23.080068 1683796 cri.go:129] container: {ID:1d6d04a32577cb2690c03ea4c47a4ae8d00628d8d6eda57c874304ed215955af Status:running}
	I1124 08:55:23.080073 1683796 cri.go:131] skipping 1d6d04a32577cb2690c03ea4c47a4ae8d00628d8d6eda57c874304ed215955af - not in ps
	I1124 08:55:23.080076 1683796 cri.go:129] container: {ID:31e96b363c821d8fc29dc1cb352bf3573f9460d013794299fc7447452c6b41ef Status:running}
	I1124 08:55:23.080079 1683796 cri.go:131] skipping 31e96b363c821d8fc29dc1cb352bf3573f9460d013794299fc7447452c6b41ef - not in ps
	I1124 08:55:23.080082 1683796 cri.go:129] container: {ID:39dbd124ad9131afd2e38d1cc6019bc4e02106f43653bf39b293c3e257811124 Status:running}
	I1124 08:55:23.080087 1683796 cri.go:135] skipping {39dbd124ad9131afd2e38d1cc6019bc4e02106f43653bf39b293c3e257811124 running}: state = "running", want "paused"
	I1124 08:55:23.080092 1683796 cri.go:129] container: {ID:5f4eb46413316f2419943b36f8887735d96c24e947e5f24a833b39cb7126a31b Status:running}
	I1124 08:55:23.080097 1683796 cri.go:131] skipping 5f4eb46413316f2419943b36f8887735d96c24e947e5f24a833b39cb7126a31b - not in ps
	I1124 08:55:23.080099 1683796 cri.go:129] container: {ID:62155860688a446bc4f0edf281779e28f4ecf113640dbf9995722ce6bfe4cd55 Status:running}
	I1124 08:55:23.080102 1683796 cri.go:135] skipping {62155860688a446bc4f0edf281779e28f4ecf113640dbf9995722ce6bfe4cd55 running}: state = "running", want "paused"
	I1124 08:55:23.080106 1683796 cri.go:129] container: {ID:63a1adeff106f91c197028326fa28f1d3fb7e75d4ec730fbd43845e2859206fe Status:running}
	I1124 08:55:23.080110 1683796 cri.go:131] skipping 63a1adeff106f91c197028326fa28f1d3fb7e75d4ec730fbd43845e2859206fe - not in ps
	I1124 08:55:23.080113 1683796 cri.go:129] container: {ID:784ae9c9b61292b5cfcfba35d59469a6a91365290aa46304545499084ede8d7e Status:running}
	I1124 08:55:23.080118 1683796 cri.go:135] skipping {784ae9c9b61292b5cfcfba35d59469a6a91365290aa46304545499084ede8d7e running}: state = "running", want "paused"
	I1124 08:55:23.080121 1683796 cri.go:129] container: {ID:8a2c1ed065d22c98f59a6c1c4a6816d5635a1993d4b79170db5bdf930e6a5464 Status:running}
	I1124 08:55:23.080125 1683796 cri.go:135] skipping {8a2c1ed065d22c98f59a6c1c4a6816d5635a1993d4b79170db5bdf930e6a5464 running}: state = "running", want "paused"
	I1124 08:55:23.080128 1683796 cri.go:129] container: {ID:abc7abcd5ce10cbacf352ea774b148d53af698f0d199f0022976bdac08da8136 Status:running}
	I1124 08:55:23.080132 1683796 cri.go:131] skipping abc7abcd5ce10cbacf352ea774b148d53af698f0d199f0022976bdac08da8136 - not in ps
	I1124 08:55:23.080135 1683796 cri.go:129] container: {ID:b1743c0f4c0960bc5012066e296265346c45ac42bb95c03d009712577c3c8885 Status:running}
	I1124 08:55:23.080139 1683796 cri.go:131] skipping b1743c0f4c0960bc5012066e296265346c45ac42bb95c03d009712577c3c8885 - not in ps
	I1124 08:55:23.080141 1683796 cri.go:129] container: {ID:bd056617869f45bc86f625d12c70a5bb57a80728d460178c634e8500f33caa6c Status:running}
	I1124 08:55:23.080144 1683796 cri.go:131] skipping bd056617869f45bc86f625d12c70a5bb57a80728d460178c634e8500f33caa6c - not in ps
	I1124 08:55:23.080146 1683796 cri.go:129] container: {ID:c59aa8a0dc911d866d1284907405697c4c3891d4b4c4423e9f764fbf37efa9c9 Status:running}
	I1124 08:55:23.080151 1683796 cri.go:135] skipping {c59aa8a0dc911d866d1284907405697c4c3891d4b4c4423e9f764fbf37efa9c9 running}: state = "running", want "paused"
	I1124 08:55:23.080153 1683796 cri.go:129] container: {ID:c67aa399f4b3099636ae5de0029e6f8778af29edd2cb80d538b3fce9f2752591 Status:running}
	I1124 08:55:23.080156 1683796 cri.go:135] skipping {c67aa399f4b3099636ae5de0029e6f8778af29edd2cb80d538b3fce9f2752591 running}: state = "running", want "paused"
	I1124 08:55:23.080160 1683796 cri.go:129] container: {ID:ff28a2e89f65b3323fbc0f41262a0b445c865c0474fbdac4ec03f66b94a6826a Status:running}
	I1124 08:55:23.080166 1683796 cri.go:135] skipping {ff28a2e89f65b3323fbc0f41262a0b445c865c0474fbdac4ec03f66b94a6826a running}: state = "running", want "paused"
	I1124 08:55:23.080218 1683796 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1124 08:55:23.088301 1683796 kubeadm.go:417] found existing configuration files, will attempt cluster restart
	I1124 08:55:23.088314 1683796 kubeadm.go:598] restartPrimaryControlPlane start ...
	I1124 08:55:23.088365 1683796 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I1124 08:55:23.095812 1683796 kubeadm.go:131] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I1124 08:55:23.096326 1683796 kubeconfig.go:125] found "functional-941011" server: "https://192.168.49.2:8441"
	I1124 08:55:23.097627 1683796 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I1124 08:55:23.105692 1683796 kubeadm.go:645] detected kubeadm config drift (will reconfigure cluster from new /var/tmp/minikube/kubeadm.yaml):
	-- stdout --
	--- /var/tmp/minikube/kubeadm.yaml	2025-11-24 08:53:55.903044178 +0000
	+++ /var/tmp/minikube/kubeadm.yaml.new	2025-11-24 08:55:22.164109576 +0000
	@@ -24,7 +24,7 @@
	   certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	   extraArgs:
	     - name: "enable-admission-plugins"
	-      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	+      value: "NamespaceAutoProvision"
	 controllerManager:
	   extraArgs:
	     - name: "allocate-node-cidrs"
	
	-- /stdout --
	I1124 08:55:23.105701 1683796 kubeadm.go:1161] stopping kube-system containers ...
	I1124 08:55:23.105713 1683796 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name: Namespaces:[kube-system]}
	I1124 08:55:23.105778 1683796 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1124 08:55:23.136296 1683796 cri.go:89] found id: "c59aa8a0dc911d866d1284907405697c4c3891d4b4c4423e9f764fbf37efa9c9"
	I1124 08:55:23.136307 1683796 cri.go:89] found id: "05393cf0d0a6a4762c03e908fda17fafda4ed38bac67cdf553df9130af52fa37"
	I1124 08:55:23.136310 1683796 cri.go:89] found id: "ff28a2e89f65b3323fbc0f41262a0b445c865c0474fbdac4ec03f66b94a6826a"
	I1124 08:55:23.136314 1683796 cri.go:89] found id: "62155860688a446bc4f0edf281779e28f4ecf113640dbf9995722ce6bfe4cd55"
	I1124 08:55:23.136316 1683796 cri.go:89] found id: "39dbd124ad9131afd2e38d1cc6019bc4e02106f43653bf39b293c3e257811124"
	I1124 08:55:23.136318 1683796 cri.go:89] found id: "784ae9c9b61292b5cfcfba35d59469a6a91365290aa46304545499084ede8d7e"
	I1124 08:55:23.136320 1683796 cri.go:89] found id: "8a2c1ed065d22c98f59a6c1c4a6816d5635a1993d4b79170db5bdf930e6a5464"
	I1124 08:55:23.136322 1683796 cri.go:89] found id: "c67aa399f4b3099636ae5de0029e6f8778af29edd2cb80d538b3fce9f2752591"
	I1124 08:55:23.136324 1683796 cri.go:89] found id: ""
	I1124 08:55:23.136329 1683796 cri.go:252] Stopping containers: [c59aa8a0dc911d866d1284907405697c4c3891d4b4c4423e9f764fbf37efa9c9 05393cf0d0a6a4762c03e908fda17fafda4ed38bac67cdf553df9130af52fa37 ff28a2e89f65b3323fbc0f41262a0b445c865c0474fbdac4ec03f66b94a6826a 62155860688a446bc4f0edf281779e28f4ecf113640dbf9995722ce6bfe4cd55 39dbd124ad9131afd2e38d1cc6019bc4e02106f43653bf39b293c3e257811124 784ae9c9b61292b5cfcfba35d59469a6a91365290aa46304545499084ede8d7e 8a2c1ed065d22c98f59a6c1c4a6816d5635a1993d4b79170db5bdf930e6a5464 c67aa399f4b3099636ae5de0029e6f8778af29edd2cb80d538b3fce9f2752591]
	I1124 08:55:23.136385 1683796 ssh_runner.go:195] Run: which crictl
	I1124 08:55:23.140104 1683796 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl stop --timeout=10 c59aa8a0dc911d866d1284907405697c4c3891d4b4c4423e9f764fbf37efa9c9 05393cf0d0a6a4762c03e908fda17fafda4ed38bac67cdf553df9130af52fa37 ff28a2e89f65b3323fbc0f41262a0b445c865c0474fbdac4ec03f66b94a6826a 62155860688a446bc4f0edf281779e28f4ecf113640dbf9995722ce6bfe4cd55 39dbd124ad9131afd2e38d1cc6019bc4e02106f43653bf39b293c3e257811124 784ae9c9b61292b5cfcfba35d59469a6a91365290aa46304545499084ede8d7e 8a2c1ed065d22c98f59a6c1c4a6816d5635a1993d4b79170db5bdf930e6a5464 c67aa399f4b3099636ae5de0029e6f8778af29edd2cb80d538b3fce9f2752591
	I1124 08:55:38.740203 1683796 ssh_runner.go:235] Completed: sudo /usr/local/bin/crictl stop --timeout=10 c59aa8a0dc911d866d1284907405697c4c3891d4b4c4423e9f764fbf37efa9c9 05393cf0d0a6a4762c03e908fda17fafda4ed38bac67cdf553df9130af52fa37 ff28a2e89f65b3323fbc0f41262a0b445c865c0474fbdac4ec03f66b94a6826a 62155860688a446bc4f0edf281779e28f4ecf113640dbf9995722ce6bfe4cd55 39dbd124ad9131afd2e38d1cc6019bc4e02106f43653bf39b293c3e257811124 784ae9c9b61292b5cfcfba35d59469a6a91365290aa46304545499084ede8d7e 8a2c1ed065d22c98f59a6c1c4a6816d5635a1993d4b79170db5bdf930e6a5464 c67aa399f4b3099636ae5de0029e6f8778af29edd2cb80d538b3fce9f2752591: (15.600060197s)
	I1124 08:55:38.740279 1683796 ssh_runner.go:195] Run: sudo systemctl stop kubelet
	I1124 08:55:38.854449 1683796 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1124 08:55:38.862427 1683796 kubeadm.go:158] found existing configuration files:
	-rw------- 1 root root 5635 Nov 24 08:54 /etc/kubernetes/admin.conf
	-rw------- 1 root root 5636 Nov 24 08:54 /etc/kubernetes/controller-manager.conf
	-rw------- 1 root root 1972 Nov 24 08:54 /etc/kubernetes/kubelet.conf
	-rw------- 1 root root 5584 Nov 24 08:54 /etc/kubernetes/scheduler.conf
	
	I1124 08:55:38.862515 1683796 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1124 08:55:38.870471 1683796 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1124 08:55:38.877759 1683796 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1124 08:55:38.877814 1683796 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1124 08:55:38.885406 1683796 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1124 08:55:38.893318 1683796 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1124 08:55:38.893374 1683796 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1124 08:55:38.900695 1683796 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1124 08:55:38.908206 1683796 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1124 08:55:38.908263 1683796 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1124 08:55:38.915926 1683796 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1124 08:55:38.923670 1683796 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.34.2:$PATH" kubeadm init phase certs all --config /var/tmp/minikube/kubeadm.yaml"
	I1124 08:55:38.973745 1683796 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.34.2:$PATH" kubeadm init phase kubeconfig all --config /var/tmp/minikube/kubeadm.yaml"
	I1124 08:55:41.926679 1683796 ssh_runner.go:235] Completed: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.34.2:$PATH" kubeadm init phase kubeconfig all --config /var/tmp/minikube/kubeadm.yaml": (2.952907872s)
	I1124 08:55:41.926747 1683796 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.34.2:$PATH" kubeadm init phase kubelet-start --config /var/tmp/minikube/kubeadm.yaml"
	I1124 08:55:42.208569 1683796 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.34.2:$PATH" kubeadm init phase control-plane all --config /var/tmp/minikube/kubeadm.yaml"
	I1124 08:55:42.278820 1683796 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.34.2:$PATH" kubeadm init phase etcd local --config /var/tmp/minikube/kubeadm.yaml"
	I1124 08:55:42.349774 1683796 api_server.go:52] waiting for apiserver process to appear ...
	I1124 08:55:42.349846 1683796 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 08:55:42.850027 1683796 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 08:55:43.350436 1683796 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 08:55:43.365115 1683796 api_server.go:72] duration metric: took 1.015341327s to wait for apiserver process to appear ...
	I1124 08:55:43.365130 1683796 api_server.go:88] waiting for apiserver healthz status ...
	I1124 08:55:43.365149 1683796 api_server.go:253] Checking apiserver healthz at https://192.168.49.2:8441/healthz ...
	I1124 08:55:43.365510 1683796 api_server.go:269] stopped: https://192.168.49.2:8441/healthz: Get "https://192.168.49.2:8441/healthz": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 08:55:43.866007 1683796 api_server.go:253] Checking apiserver healthz at https://192.168.49.2:8441/healthz ...
	I1124 08:55:46.275142 1683796 api_server.go:279] https://192.168.49.2:8441/healthz returned 403:
	{"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/healthz\"","reason":"Forbidden","details":{},"code":403}
	W1124 08:55:46.275161 1683796 api_server.go:103] status: https://192.168.49.2:8441/healthz returned error 403:
	{"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/healthz\"","reason":"Forbidden","details":{},"code":403}
	I1124 08:55:46.275172 1683796 api_server.go:253] Checking apiserver healthz at https://192.168.49.2:8441/healthz ...
	I1124 08:55:46.513506 1683796 api_server.go:279] https://192.168.49.2:8441/healthz returned 403:
	{"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/healthz\"","reason":"Forbidden","details":{},"code":403}
	W1124 08:55:46.513531 1683796 api_server.go:103] status: https://192.168.49.2:8441/healthz returned error 403:
	{"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/healthz\"","reason":"Forbidden","details":{},"code":403}
	I1124 08:55:46.513544 1683796 api_server.go:253] Checking apiserver healthz at https://192.168.49.2:8441/healthz ...
	I1124 08:55:46.539416 1683796 api_server.go:279] https://192.168.49.2:8441/healthz returned 403:
	{"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/healthz\"","reason":"Forbidden","details":{},"code":403}
	W1124 08:55:46.539437 1683796 api_server.go:103] status: https://192.168.49.2:8441/healthz returned error 403:
	{"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/healthz\"","reason":"Forbidden","details":{},"code":403}
	I1124 08:55:46.865904 1683796 api_server.go:253] Checking apiserver healthz at https://192.168.49.2:8441/healthz ...
	I1124 08:55:46.874643 1683796 api_server.go:279] https://192.168.49.2:8441/healthz returned 500:
	[+]ping ok
	[+]log ok
	[+]etcd ok
	[+]poststarthook/start-apiserver-admission-initializer ok
	[+]poststarthook/generic-apiserver-start-informers ok
	[+]poststarthook/priority-and-fairness-config-consumer ok
	[+]poststarthook/priority-and-fairness-filter ok
	[+]poststarthook/storage-object-count-tracker-hook ok
	[+]poststarthook/start-apiextensions-informers ok
	[+]poststarthook/start-apiextensions-controllers ok
	[+]poststarthook/crd-informer-synced ok
	[+]poststarthook/start-system-namespaces-controller ok
	[+]poststarthook/start-cluster-authentication-info-controller ok
	[+]poststarthook/start-kube-apiserver-identity-lease-controller ok
	[+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok
	[+]poststarthook/start-legacy-token-tracking-controller ok
	[+]poststarthook/start-service-ip-repair-controllers ok
	[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
	[-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
	[+]poststarthook/priority-and-fairness-config-producer ok
	[+]poststarthook/bootstrap-controller ok
	[+]poststarthook/start-kubernetes-service-cidr-controller ok
	[+]poststarthook/aggregator-reload-proxy-client-cert ok
	[+]poststarthook/start-kube-aggregator-informers ok
	[+]poststarthook/apiservice-status-local-available-controller ok
	[+]poststarthook/apiservice-status-remote-available-controller ok
	[+]poststarthook/apiservice-registration-controller ok
	[+]poststarthook/apiservice-discovery-controller ok
	[+]poststarthook/kube-apiserver-autoregistration ok
	[+]autoregister-completion ok
	[+]poststarthook/apiservice-openapi-controller ok
	[+]poststarthook/apiservice-openapiv3-controller ok
	healthz check failed
	W1124 08:55:46.874660 1683796 api_server.go:103] status: https://192.168.49.2:8441/healthz returned error 500:
	[+]ping ok
	[+]log ok
	[+]etcd ok
	[+]poststarthook/start-apiserver-admission-initializer ok
	[+]poststarthook/generic-apiserver-start-informers ok
	[+]poststarthook/priority-and-fairness-config-consumer ok
	[+]poststarthook/priority-and-fairness-filter ok
	[+]poststarthook/storage-object-count-tracker-hook ok
	[+]poststarthook/start-apiextensions-informers ok
	[+]poststarthook/start-apiextensions-controllers ok
	[+]poststarthook/crd-informer-synced ok
	[+]poststarthook/start-system-namespaces-controller ok
	[+]poststarthook/start-cluster-authentication-info-controller ok
	[+]poststarthook/start-kube-apiserver-identity-lease-controller ok
	[+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok
	[+]poststarthook/start-legacy-token-tracking-controller ok
	[+]poststarthook/start-service-ip-repair-controllers ok
	[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
	[-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
	[+]poststarthook/priority-and-fairness-config-producer ok
	[+]poststarthook/bootstrap-controller ok
	[+]poststarthook/start-kubernetes-service-cidr-controller ok
	[+]poststarthook/aggregator-reload-proxy-client-cert ok
	[+]poststarthook/start-kube-aggregator-informers ok
	[+]poststarthook/apiservice-status-local-available-controller ok
	[+]poststarthook/apiservice-status-remote-available-controller ok
	[+]poststarthook/apiservice-registration-controller ok
	[+]poststarthook/apiservice-discovery-controller ok
	[+]poststarthook/kube-apiserver-autoregistration ok
	[+]autoregister-completion ok
	[+]poststarthook/apiservice-openapi-controller ok
	[+]poststarthook/apiservice-openapiv3-controller ok
	healthz check failed
	I1124 08:55:47.366174 1683796 api_server.go:253] Checking apiserver healthz at https://192.168.49.2:8441/healthz ...
	I1124 08:55:47.375673 1683796 api_server.go:279] https://192.168.49.2:8441/healthz returned 500:
	[+]ping ok
	[+]log ok
	[+]etcd ok
	[+]poststarthook/start-apiserver-admission-initializer ok
	[+]poststarthook/generic-apiserver-start-informers ok
	[+]poststarthook/priority-and-fairness-config-consumer ok
	[+]poststarthook/priority-and-fairness-filter ok
	[+]poststarthook/storage-object-count-tracker-hook ok
	[+]poststarthook/start-apiextensions-informers ok
	[+]poststarthook/start-apiextensions-controllers ok
	[+]poststarthook/crd-informer-synced ok
	[+]poststarthook/start-system-namespaces-controller ok
	[+]poststarthook/start-cluster-authentication-info-controller ok
	[+]poststarthook/start-kube-apiserver-identity-lease-controller ok
	[+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok
	[+]poststarthook/start-legacy-token-tracking-controller ok
	[+]poststarthook/start-service-ip-repair-controllers ok
	[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
	[+]poststarthook/scheduling/bootstrap-system-priority-classes ok
	[+]poststarthook/priority-and-fairness-config-producer ok
	[+]poststarthook/bootstrap-controller ok
	[+]poststarthook/start-kubernetes-service-cidr-controller ok
	[+]poststarthook/aggregator-reload-proxy-client-cert ok
	[+]poststarthook/start-kube-aggregator-informers ok
	[+]poststarthook/apiservice-status-local-available-controller ok
	[+]poststarthook/apiservice-status-remote-available-controller ok
	[+]poststarthook/apiservice-registration-controller ok
	[+]poststarthook/apiservice-discovery-controller ok
	[+]poststarthook/kube-apiserver-autoregistration ok
	[+]autoregister-completion ok
	[+]poststarthook/apiservice-openapi-controller ok
	[+]poststarthook/apiservice-openapiv3-controller ok
	healthz check failed
	W1124 08:55:47.375692 1683796 api_server.go:103] status: https://192.168.49.2:8441/healthz returned error 500:
	[+]ping ok
	[+]log ok
	[+]etcd ok
	[+]poststarthook/start-apiserver-admission-initializer ok
	[+]poststarthook/generic-apiserver-start-informers ok
	[+]poststarthook/priority-and-fairness-config-consumer ok
	[+]poststarthook/priority-and-fairness-filter ok
	[+]poststarthook/storage-object-count-tracker-hook ok
	[+]poststarthook/start-apiextensions-informers ok
	[+]poststarthook/start-apiextensions-controllers ok
	[+]poststarthook/crd-informer-synced ok
	[+]poststarthook/start-system-namespaces-controller ok
	[+]poststarthook/start-cluster-authentication-info-controller ok
	[+]poststarthook/start-kube-apiserver-identity-lease-controller ok
	[+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok
	[+]poststarthook/start-legacy-token-tracking-controller ok
	[+]poststarthook/start-service-ip-repair-controllers ok
	[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
	[+]poststarthook/scheduling/bootstrap-system-priority-classes ok
	[+]poststarthook/priority-and-fairness-config-producer ok
	[+]poststarthook/bootstrap-controller ok
	[+]poststarthook/start-kubernetes-service-cidr-controller ok
	[+]poststarthook/aggregator-reload-proxy-client-cert ok
	[+]poststarthook/start-kube-aggregator-informers ok
	[+]poststarthook/apiservice-status-local-available-controller ok
	[+]poststarthook/apiservice-status-remote-available-controller ok
	[+]poststarthook/apiservice-registration-controller ok
	[+]poststarthook/apiservice-discovery-controller ok
	[+]poststarthook/kube-apiserver-autoregistration ok
	[+]autoregister-completion ok
	[+]poststarthook/apiservice-openapi-controller ok
	[+]poststarthook/apiservice-openapiv3-controller ok
	healthz check failed
	I1124 08:55:47.866849 1683796 api_server.go:253] Checking apiserver healthz at https://192.168.49.2:8441/healthz ...
	I1124 08:55:47.874979 1683796 api_server.go:279] https://192.168.49.2:8441/healthz returned 200:
	ok
	I1124 08:55:47.889371 1683796 api_server.go:141] control plane version: v1.34.2
	I1124 08:55:47.889387 1683796 api_server.go:131] duration metric: took 4.524251682s to wait for apiserver health ...
	I1124 08:55:47.889396 1683796 cni.go:84] Creating CNI manager for ""
	I1124 08:55:47.889401 1683796 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1124 08:55:47.893396 1683796 out.go:179] * Configuring CNI (Container Networking Interface) ...
	I1124 08:55:47.896399 1683796 ssh_runner.go:195] Run: stat /opt/cni/bin/portmap
	I1124 08:55:47.900536 1683796 cni.go:182] applying CNI manifest using /var/lib/minikube/binaries/v1.34.2/kubectl ...
	I1124 08:55:47.900547 1683796 ssh_runner.go:362] scp memory --> /var/tmp/minikube/cni.yaml (2601 bytes)
	I1124 08:55:47.913525 1683796 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.34.2/kubectl apply --kubeconfig=/var/lib/minikube/kubeconfig -f /var/tmp/minikube/cni.yaml
	I1124 08:55:48.349446 1683796 system_pods.go:43] waiting for kube-system pods to appear ...
	I1124 08:55:48.353682 1683796 system_pods.go:59] 8 kube-system pods found
	I1124 08:55:48.353700 1683796 system_pods.go:61] "coredns-66bc5c9577-slkfz" [4ef50cca-2036-4f7d-b4e9-c814be440acd] Running
	I1124 08:55:48.353709 1683796 system_pods.go:61] "etcd-functional-941011" [74f9f006-4d5e-42c3-a258-2ad723af36a9] Running / Ready:ContainersNotReady (containers with unready status: [etcd]) / ContainersReady:ContainersNotReady (containers with unready status: [etcd])
	I1124 08:55:48.353712 1683796 system_pods.go:61] "kindnet-vsrrw" [3dccbc03-f381-4e6a-9c42-e221c48dc82b] Running
	I1124 08:55:48.353718 1683796 system_pods.go:61] "kube-apiserver-functional-941011" [e9f09ee1-a193-4009-bbe3-baee8265b3be] Running / Ready:ContainersNotReady (containers with unready status: [kube-apiserver]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-apiserver])
	I1124 08:55:48.353723 1683796 system_pods.go:61] "kube-controller-manager-functional-941011" [f25efdfd-640a-498b-bbb2-8a806727e15f] Running / Ready:ContainersNotReady (containers with unready status: [kube-controller-manager]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-controller-manager])
	I1124 08:55:48.353727 1683796 system_pods.go:61] "kube-proxy-kmdsq" [d98705fa-0c92-4b81-a03b-02c36c100f80] Running
	I1124 08:55:48.353731 1683796 system_pods.go:61] "kube-scheduler-functional-941011" [bca2c4b3-2af7-4bb6-9b46-493227dcfea4] Running / Ready:ContainersNotReady (containers with unready status: [kube-scheduler]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-scheduler])
	I1124 08:55:48.353745 1683796 system_pods.go:61] "storage-provisioner" [13c14b20-a1ad-4212-98a5-325e1115d97a] Running / Ready:ContainersNotReady (containers with unready status: [storage-provisioner]) / ContainersReady:ContainersNotReady (containers with unready status: [storage-provisioner])
	I1124 08:55:48.353750 1683796 system_pods.go:74] duration metric: took 4.294297ms to wait for pod list to return data ...
	I1124 08:55:48.353757 1683796 node_conditions.go:102] verifying NodePressure condition ...
	I1124 08:55:48.356760 1683796 node_conditions.go:122] node storage ephemeral capacity is 203034800Ki
	I1124 08:55:48.356780 1683796 node_conditions.go:123] node cpu capacity is 2
	I1124 08:55:48.356791 1683796 node_conditions.go:105] duration metric: took 3.030347ms to run NodePressure ...
	I1124 08:55:48.356851 1683796 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.34.2:$PATH" kubeadm init phase addon all --config /var/tmp/minikube/kubeadm.yaml"
	I1124 08:55:48.701909 1683796 kubeadm.go:729] waiting for restarted kubelet to initialise ...
	I1124 08:55:48.723023 1683796 kubeadm.go:744] kubelet initialised
	I1124 08:55:48.723045 1683796 kubeadm.go:745] duration metric: took 21.123168ms waiting for restarted kubelet to initialise ...
	I1124 08:55:48.723061 1683796 ssh_runner.go:195] Run: /bin/bash -c "cat /proc/$(pgrep kube-apiserver)/oom_adj"
	I1124 08:55:48.741386 1683796 ops.go:34] apiserver oom_adj: -16
	I1124 08:55:48.741397 1683796 kubeadm.go:602] duration metric: took 25.653078507s to restartPrimaryControlPlane
	I1124 08:55:48.741406 1683796 kubeadm.go:403] duration metric: took 25.721388484s to StartCluster
	I1124 08:55:48.741420 1683796 settings.go:142] acquiring lock: {Name:mk6c04793f5fd4f38f92abf4357247f2ccd7fc4e Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1124 08:55:48.741497 1683796 settings.go:150] Updating kubeconfig:  /home/jenkins/minikube-integration/21978-1652607/kubeconfig
	I1124 08:55:48.742170 1683796 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21978-1652607/kubeconfig: {Name:mk02121ae6148bede61eabf0ed4e1826024715f4 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1124 08:55:48.742414 1683796 start.go:236] Will wait 6m0s for node &{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.34.2 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I1124 08:55:48.742854 1683796 addons.go:527] enable addons start: toEnable=map[ambassador:false amd-gpu-device-plugin:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:true efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubetail:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:true storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I1124 08:55:48.742926 1683796 addons.go:70] Setting storage-provisioner=true in profile "functional-941011"
	I1124 08:55:48.742939 1683796 addons.go:239] Setting addon storage-provisioner=true in "functional-941011"
	W1124 08:55:48.742943 1683796 addons.go:248] addon storage-provisioner should already be in state true
	I1124 08:55:48.742964 1683796 host.go:66] Checking if "functional-941011" exists ...
	I1124 08:55:48.743617 1683796 cli_runner.go:164] Run: docker container inspect functional-941011 --format={{.State.Status}}
	I1124 08:55:48.744038 1683796 config.go:182] Loaded profile config "functional-941011": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
	I1124 08:55:48.744161 1683796 addons.go:70] Setting default-storageclass=true in profile "functional-941011"
	I1124 08:55:48.744172 1683796 addons_storage_classes.go:34] enableOrDisableStorageClasses default-storageclass=true on "functional-941011"
	I1124 08:55:48.744496 1683796 cli_runner.go:164] Run: docker container inspect functional-941011 --format={{.State.Status}}
	I1124 08:55:48.747181 1683796 out.go:179] * Verifying Kubernetes components...
	I1124 08:55:48.750209 1683796 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1124 08:55:48.788206 1683796 addons.go:239] Setting addon default-storageclass=true in "functional-941011"
	W1124 08:55:48.788216 1683796 addons.go:248] addon default-storageclass should already be in state true
	I1124 08:55:48.788241 1683796 host.go:66] Checking if "functional-941011" exists ...
	I1124 08:55:48.788759 1683796 cli_runner.go:164] Run: docker container inspect functional-941011 --format={{.State.Status}}
	I1124 08:55:48.790563 1683796 out.go:179]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	I1124 08:55:48.795274 1683796 addons.go:436] installing /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 08:55:48.795286 1683796 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
	I1124 08:55:48.795355 1683796 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-941011
	I1124 08:55:48.811196 1683796 addons.go:436] installing /etc/kubernetes/addons/storageclass.yaml
	I1124 08:55:48.811209 1683796 ssh_runner.go:362] scp storageclass/storageclass.yaml --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
	I1124 08:55:48.811287 1683796 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-941011
	I1124 08:55:48.816644 1683796 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34679 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-941011/id_rsa Username:docker}
	I1124 08:55:48.842660 1683796 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34679 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-941011/id_rsa Username:docker}
	I1124 08:55:48.997019 1683796 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.2/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 08:55:49.001001 1683796 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1124 08:55:49.025813 1683796 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.2/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
	I1124 08:55:49.851141 1683796 node_ready.go:35] waiting up to 6m0s for node "functional-941011" to be "Ready" ...
	I1124 08:55:49.854823 1683796 node_ready.go:49] node "functional-941011" is "Ready"
	I1124 08:55:49.854838 1683796 node_ready.go:38] duration metric: took 3.680389ms for node "functional-941011" to be "Ready" ...
	I1124 08:55:49.854850 1683796 api_server.go:52] waiting for apiserver process to appear ...
	I1124 08:55:49.854910 1683796 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 08:55:49.862033 1683796 out.go:179] * Enabled addons: storage-provisioner, default-storageclass
	I1124 08:55:49.864851 1683796 addons.go:530] duration metric: took 1.121992708s for enable addons: enabled=[storage-provisioner default-storageclass]
	I1124 08:55:49.870042 1683796 api_server.go:72] duration metric: took 1.127603819s to wait for apiserver process to appear ...
	I1124 08:55:49.870061 1683796 api_server.go:88] waiting for apiserver healthz status ...
	I1124 08:55:49.870096 1683796 api_server.go:253] Checking apiserver healthz at https://192.168.49.2:8441/healthz ...
	I1124 08:55:49.884243 1683796 api_server.go:279] https://192.168.49.2:8441/healthz returned 200:
	ok
	I1124 08:55:49.885760 1683796 api_server.go:141] control plane version: v1.34.2
	I1124 08:55:49.885775 1683796 api_server.go:131] duration metric: took 15.709893ms to wait for apiserver health ...
	I1124 08:55:49.885784 1683796 system_pods.go:43] waiting for kube-system pods to appear ...
	I1124 08:55:49.889422 1683796 system_pods.go:59] 8 kube-system pods found
	I1124 08:55:49.889439 1683796 system_pods.go:61] "coredns-66bc5c9577-slkfz" [4ef50cca-2036-4f7d-b4e9-c814be440acd] Running
	I1124 08:55:49.889446 1683796 system_pods.go:61] "etcd-functional-941011" [74f9f006-4d5e-42c3-a258-2ad723af36a9] Running / Ready:ContainersNotReady (containers with unready status: [etcd]) / ContainersReady:ContainersNotReady (containers with unready status: [etcd])
	I1124 08:55:49.889449 1683796 system_pods.go:61] "kindnet-vsrrw" [3dccbc03-f381-4e6a-9c42-e221c48dc82b] Running
	I1124 08:55:49.889456 1683796 system_pods.go:61] "kube-apiserver-functional-941011" [e9f09ee1-a193-4009-bbe3-baee8265b3be] Running / Ready:ContainersNotReady (containers with unready status: [kube-apiserver]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-apiserver])
	I1124 08:55:49.889462 1683796 system_pods.go:61] "kube-controller-manager-functional-941011" [f25efdfd-640a-498b-bbb2-8a806727e15f] Running / Ready:ContainersNotReady (containers with unready status: [kube-controller-manager]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-controller-manager])
	I1124 08:55:49.889465 1683796 system_pods.go:61] "kube-proxy-kmdsq" [d98705fa-0c92-4b81-a03b-02c36c100f80] Running
	I1124 08:55:49.889470 1683796 system_pods.go:61] "kube-scheduler-functional-941011" [bca2c4b3-2af7-4bb6-9b46-493227dcfea4] Running / Ready:ContainersNotReady (containers with unready status: [kube-scheduler]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-scheduler])
	I1124 08:55:49.889473 1683796 system_pods.go:61] "storage-provisioner" [13c14b20-a1ad-4212-98a5-325e1115d97a] Running
	I1124 08:55:49.889478 1683796 system_pods.go:74] duration metric: took 3.689448ms to wait for pod list to return data ...
	I1124 08:55:49.889484 1683796 default_sa.go:34] waiting for default service account to be created ...
	I1124 08:55:49.895445 1683796 default_sa.go:45] found service account: "default"
	I1124 08:55:49.895459 1683796 default_sa.go:55] duration metric: took 5.969787ms for default service account to be created ...
	I1124 08:55:49.895467 1683796 system_pods.go:116] waiting for k8s-apps to be running ...
	I1124 08:55:49.899110 1683796 system_pods.go:86] 8 kube-system pods found
	I1124 08:55:49.899125 1683796 system_pods.go:89] "coredns-66bc5c9577-slkfz" [4ef50cca-2036-4f7d-b4e9-c814be440acd] Running
	I1124 08:55:49.899134 1683796 system_pods.go:89] "etcd-functional-941011" [74f9f006-4d5e-42c3-a258-2ad723af36a9] Running / Ready:ContainersNotReady (containers with unready status: [etcd]) / ContainersReady:ContainersNotReady (containers with unready status: [etcd])
	I1124 08:55:49.899138 1683796 system_pods.go:89] "kindnet-vsrrw" [3dccbc03-f381-4e6a-9c42-e221c48dc82b] Running
	I1124 08:55:49.899144 1683796 system_pods.go:89] "kube-apiserver-functional-941011" [e9f09ee1-a193-4009-bbe3-baee8265b3be] Running / Ready:ContainersNotReady (containers with unready status: [kube-apiserver]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-apiserver])
	I1124 08:55:49.899149 1683796 system_pods.go:89] "kube-controller-manager-functional-941011" [f25efdfd-640a-498b-bbb2-8a806727e15f] Running / Ready:ContainersNotReady (containers with unready status: [kube-controller-manager]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-controller-manager])
	I1124 08:55:49.899153 1683796 system_pods.go:89] "kube-proxy-kmdsq" [d98705fa-0c92-4b81-a03b-02c36c100f80] Running
	I1124 08:55:49.899157 1683796 system_pods.go:89] "kube-scheduler-functional-941011" [bca2c4b3-2af7-4bb6-9b46-493227dcfea4] Running / Ready:ContainersNotReady (containers with unready status: [kube-scheduler]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-scheduler])
	I1124 08:55:49.899162 1683796 system_pods.go:89] "storage-provisioner" [13c14b20-a1ad-4212-98a5-325e1115d97a] Running
	I1124 08:55:49.899168 1683796 system_pods.go:126] duration metric: took 3.696496ms to wait for k8s-apps to be running ...
	I1124 08:55:49.899174 1683796 system_svc.go:44] waiting for kubelet service to be running ....
	I1124 08:55:49.899231 1683796 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1124 08:55:49.923216 1683796 system_svc.go:56] duration metric: took 24.032693ms WaitForService to wait for kubelet
	I1124 08:55:49.923233 1683796 kubeadm.go:587] duration metric: took 1.180799278s to wait for: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I1124 08:55:49.923248 1683796 node_conditions.go:102] verifying NodePressure condition ...
	I1124 08:55:49.941255 1683796 node_conditions.go:122] node storage ephemeral capacity is 203034800Ki
	I1124 08:55:49.941272 1683796 node_conditions.go:123] node cpu capacity is 2
	I1124 08:55:49.941293 1683796 node_conditions.go:105] duration metric: took 18.030528ms to run NodePressure ...
	I1124 08:55:49.941305 1683796 start.go:242] waiting for startup goroutines ...
	I1124 08:55:49.941312 1683796 start.go:247] waiting for cluster config update ...
	I1124 08:55:49.941322 1683796 start.go:256] writing updated cluster config ...
	I1124 08:55:49.941620 1683796 ssh_runner.go:195] Run: rm -f paused
	I1124 08:55:49.945347 1683796 pod_ready.go:37] extra waiting up to 4m0s for all "kube-system" pods having one of [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] labels to be "Ready" ...
	I1124 08:55:49.954422 1683796 pod_ready.go:83] waiting for pod "coredns-66bc5c9577-slkfz" in "kube-system" namespace to be "Ready" or be gone ...
	I1124 08:55:49.991315 1683796 pod_ready.go:94] pod "coredns-66bc5c9577-slkfz" is "Ready"
	I1124 08:55:49.991335 1683796 pod_ready.go:86] duration metric: took 36.896693ms for pod "coredns-66bc5c9577-slkfz" in "kube-system" namespace to be "Ready" or be gone ...
	I1124 08:55:49.998403 1683796 pod_ready.go:83] waiting for pod "etcd-functional-941011" in "kube-system" namespace to be "Ready" or be gone ...
	W1124 08:55:52.005981 1683796 pod_ready.go:104] pod "etcd-functional-941011" is not "Ready", error: <nil>
	W1124 08:55:54.007072 1683796 pod_ready.go:104] pod "etcd-functional-941011" is not "Ready", error: <nil>
	W1124 08:55:56.008297 1683796 pod_ready.go:104] pod "etcd-functional-941011" is not "Ready", error: <nil>
	W1124 08:55:58.008845 1683796 pod_ready.go:104] pod "etcd-functional-941011" is not "Ready", error: <nil>
	W1124 08:56:00.104657 1683796 pod_ready.go:104] pod "etcd-functional-941011" is not "Ready", error: <nil>
	I1124 08:56:01.503549 1683796 pod_ready.go:94] pod "etcd-functional-941011" is "Ready"
	I1124 08:56:01.503573 1683796 pod_ready.go:86] duration metric: took 11.505155599s for pod "etcd-functional-941011" in "kube-system" namespace to be "Ready" or be gone ...
	I1124 08:56:01.506747 1683796 pod_ready.go:83] waiting for pod "kube-apiserver-functional-941011" in "kube-system" namespace to be "Ready" or be gone ...
	I1124 08:56:01.511915 1683796 pod_ready.go:94] pod "kube-apiserver-functional-941011" is "Ready"
	I1124 08:56:01.511941 1683796 pod_ready.go:86] duration metric: took 5.176637ms for pod "kube-apiserver-functional-941011" in "kube-system" namespace to be "Ready" or be gone ...
	I1124 08:56:01.514411 1683796 pod_ready.go:83] waiting for pod "kube-controller-manager-functional-941011" in "kube-system" namespace to be "Ready" or be gone ...
	I1124 08:56:01.519656 1683796 pod_ready.go:94] pod "kube-controller-manager-functional-941011" is "Ready"
	I1124 08:56:01.519672 1683796 pod_ready.go:86] duration metric: took 5.247497ms for pod "kube-controller-manager-functional-941011" in "kube-system" namespace to be "Ready" or be gone ...
	I1124 08:56:01.522371 1683796 pod_ready.go:83] waiting for pod "kube-proxy-kmdsq" in "kube-system" namespace to be "Ready" or be gone ...
	I1124 08:56:01.702585 1683796 pod_ready.go:94] pod "kube-proxy-kmdsq" is "Ready"
	I1124 08:56:01.702605 1683796 pod_ready.go:86] duration metric: took 180.219267ms for pod "kube-proxy-kmdsq" in "kube-system" namespace to be "Ready" or be gone ...
	I1124 08:56:01.901839 1683796 pod_ready.go:83] waiting for pod "kube-scheduler-functional-941011" in "kube-system" namespace to be "Ready" or be gone ...
	W1124 08:56:03.907714 1683796 pod_ready.go:104] pod "kube-scheduler-functional-941011" is not "Ready", error: <nil>
	W1124 08:56:06.407320 1683796 pod_ready.go:104] pod "kube-scheduler-functional-941011" is not "Ready", error: <nil>
	W1124 08:56:08.408070 1683796 pod_ready.go:104] pod "kube-scheduler-functional-941011" is not "Ready", error: <nil>
	W1124 08:56:10.906515 1683796 pod_ready.go:104] pod "kube-scheduler-functional-941011" is not "Ready", error: <nil>
	I1124 08:56:11.907622 1683796 pod_ready.go:94] pod "kube-scheduler-functional-941011" is "Ready"
	I1124 08:56:11.907637 1683796 pod_ready.go:86] duration metric: took 10.005785693s for pod "kube-scheduler-functional-941011" in "kube-system" namespace to be "Ready" or be gone ...
	I1124 08:56:11.907648 1683796 pod_ready.go:40] duration metric: took 21.962279749s for extra waiting for all "kube-system" pods having one of [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] labels to be "Ready" ...
	I1124 08:56:11.967301 1683796 start.go:625] kubectl: 1.33.2, cluster: 1.34.2 (minor skew: 1)
	I1124 08:56:11.970560 1683796 out.go:179] * Done! kubectl is now configured to use "functional-941011" cluster and "default" namespace by default
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                      ATTEMPT             POD ID              POD                                         NAMESPACE
	85c0699fd2b2d       ce2d2cda2d858       4 minutes ago       Running             echo-server               0                   8f38d234a4a08       hello-node-75c85bcc94-srggf                 default
	e068a2929b9a7       ba04bb24b9575       4 minutes ago       Running             storage-provisioner       2                   1d6d04a32577c       storage-provisioner                         kube-system
	12cb72b6be32a       b178af3d91f80       4 minutes ago       Running             kube-apiserver            0                   43e43c411fcfa       kube-apiserver-functional-941011            kube-system
	bb12959193539       1b34917560f09       4 minutes ago       Running             kube-controller-manager   1                   5f4eb46413316       kube-controller-manager-functional-941011   kube-system
	f631f089f7db7       2c5f0dedd21c2       4 minutes ago       Running             etcd                      1                   abc7abcd5ce10       etcd-functional-941011                      kube-system
	4db14bc0f3747       4f982e73e768a       5 minutes ago       Running             kube-scheduler            1                   0fac3868adf86       kube-scheduler-functional-941011            kube-system
	b6a84160fb5a4       ba04bb24b9575       5 minutes ago       Exited              storage-provisioner       1                   1d6d04a32577c       storage-provisioner                         kube-system
	fc6fc133ebd24       138784d87c9c5       5 minutes ago       Running             coredns                   1                   b1743c0f4c096       coredns-66bc5c9577-slkfz                    kube-system
	73cbc10088bae       94bff1bec29fd       5 minutes ago       Running             kube-proxy                1                   31e96b363c821       kube-proxy-kmdsq                            kube-system
	c6aa1ae3a9097       b1a8c6f707935       5 minutes ago       Running             kindnet-cni               1                   63a1adeff106f       kindnet-vsrrw                               kube-system
	c59aa8a0dc911       138784d87c9c5       5 minutes ago       Exited              coredns                   0                   b1743c0f4c096       coredns-66bc5c9577-slkfz                    kube-system
	ff28a2e89f65b       b1a8c6f707935       6 minutes ago       Exited              kindnet-cni               0                   63a1adeff106f       kindnet-vsrrw                               kube-system
	62155860688a4       94bff1bec29fd       6 minutes ago       Exited              kube-proxy                0                   31e96b363c821       kube-proxy-kmdsq                            kube-system
	39dbd124ad913       4f982e73e768a       6 minutes ago       Exited              kube-scheduler            0                   0fac3868adf86       kube-scheduler-functional-941011            kube-system
	8a2c1ed065d22       1b34917560f09       6 minutes ago       Exited              kube-controller-manager   0                   5f4eb46413316       kube-controller-manager-functional-941011   kube-system
	c67aa399f4b30       2c5f0dedd21c2       6 minutes ago       Exited              etcd                      0                   abc7abcd5ce10       etcd-functional-941011                      kube-system
	
	
	==> containerd <==
	Nov 24 08:59:22 functional-941011 containerd[3558]: time="2025-11-24T08:59:22.878489935Z" level=info msg="stop pulling image docker.io/library/nginx:alpine: active requests=0, bytes read=21300"
	Nov 24 08:59:22 functional-941011 containerd[3558]: time="2025-11-24T08:59:22.878495441Z" level=error msg="PullImage \"docker.io/nginx:alpine\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"docker.io/library/nginx:alpine\": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/library/nginx/manifests/sha256:7391b3732e7f7ccd23ff1d02fbeadcde496f374d7460ad8e79260f8f6d2c9f90: 429 Too Many Requests\ntoomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit"
	Nov 24 08:59:37 functional-941011 containerd[3558]: time="2025-11-24T08:59:37.370667714Z" level=info msg="PullImage \"docker.io/nginx:latest\""
	Nov 24 08:59:37 functional-941011 containerd[3558]: time="2025-11-24T08:59:37.801328640Z" level=error msg="PullImage \"docker.io/nginx:latest\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"docker.io/library/nginx:latest\": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/library/nginx/manifests/sha256:553f64aecdc31b5bf944521731cd70e35da4faed96b2b7548a3d8e2598c52a42: 429 Too Many Requests\ntoomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit"
	Nov 24 08:59:37 functional-941011 containerd[3558]: time="2025-11-24T08:59:37.801356226Z" level=info msg="stop pulling image docker.io/library/nginx:latest: active requests=0, bytes read=10967"
	Nov 24 09:00:28 functional-941011 containerd[3558]: time="2025-11-24T09:00:28.238257407Z" level=info msg="container event discarded" container=c59aa8a0dc911d866d1284907405697c4c3891d4b4c4423e9f764fbf37efa9c9 type=CONTAINER_STOPPED_EVENT
	Nov 24 09:00:28 functional-941011 containerd[3558]: time="2025-11-24T09:00:28.298535922Z" level=info msg="container event discarded" container=05393cf0d0a6a4762c03e908fda17fafda4ed38bac67cdf553df9130af52fa37 type=CONTAINER_STOPPED_EVENT
	Nov 24 09:00:28 functional-941011 containerd[3558]: time="2025-11-24T09:00:28.361886841Z" level=info msg="container event discarded" container=ff28a2e89f65b3323fbc0f41262a0b445c865c0474fbdac4ec03f66b94a6826a type=CONTAINER_STOPPED_EVENT
	Nov 24 09:00:28 functional-941011 containerd[3558]: time="2025-11-24T09:00:28.430368133Z" level=info msg="container event discarded" container=62155860688a446bc4f0edf281779e28f4ecf113640dbf9995722ce6bfe4cd55 type=CONTAINER_STOPPED_EVENT
	Nov 24 09:00:28 functional-941011 containerd[3558]: time="2025-11-24T09:00:28.488832424Z" level=info msg="container event discarded" container=39dbd124ad9131afd2e38d1cc6019bc4e02106f43653bf39b293c3e257811124 type=CONTAINER_STOPPED_EVENT
	Nov 24 09:00:28 functional-941011 containerd[3558]: time="2025-11-24T09:00:28.824769722Z" level=info msg="container event discarded" container=c6aa1ae3a9097decf5035a00445a3b904d34d773c5e3cbb839ca84c4b31b2ba7 type=CONTAINER_CREATED_EVENT
	Nov 24 09:00:28 functional-941011 containerd[3558]: time="2025-11-24T09:00:28.839063572Z" level=info msg="container event discarded" container=73cbc10088baee047f5af65dd9d3da14101d5c79193d1084620f4ef78567007c type=CONTAINER_CREATED_EVENT
	Nov 24 09:00:28 functional-941011 containerd[3558]: time="2025-11-24T09:00:28.855350224Z" level=info msg="container event discarded" container=fc6fc133ebd24da2f7fbde321f7cbdf8f57ce1d20c8e70b56a4530409fe91cf6 type=CONTAINER_CREATED_EVENT
	Nov 24 09:00:28 functional-941011 containerd[3558]: time="2025-11-24T09:00:28.855402294Z" level=info msg="container event discarded" container=b6a84160fb5a462413dc19bd1857be8f2613f26a584ad28f2b8052ac97712585 type=CONTAINER_CREATED_EVENT
	Nov 24 09:00:28 functional-941011 containerd[3558]: time="2025-11-24T09:00:28.874642965Z" level=info msg="container event discarded" container=4db14bc0f3747bb08ccdf3a4232220b42a95e683884234f9fff57a09e95b88c3 type=CONTAINER_CREATED_EVENT
	Nov 24 09:00:29 functional-941011 containerd[3558]: time="2025-11-24T09:00:29.018357355Z" level=info msg="container event discarded" container=4db14bc0f3747bb08ccdf3a4232220b42a95e683884234f9fff57a09e95b88c3 type=CONTAINER_STARTED_EVENT
	Nov 24 09:00:29 functional-941011 containerd[3558]: time="2025-11-24T09:00:29.051990339Z" level=info msg="container event discarded" container=fc6fc133ebd24da2f7fbde321f7cbdf8f57ce1d20c8e70b56a4530409fe91cf6 type=CONTAINER_STARTED_EVENT
	Nov 24 09:00:29 functional-941011 containerd[3558]: time="2025-11-24T09:00:29.078293327Z" level=info msg="container event discarded" container=c6aa1ae3a9097decf5035a00445a3b904d34d773c5e3cbb839ca84c4b31b2ba7 type=CONTAINER_STARTED_EVENT
	Nov 24 09:00:29 functional-941011 containerd[3558]: time="2025-11-24T09:00:29.094508142Z" level=info msg="container event discarded" container=b6a84160fb5a462413dc19bd1857be8f2613f26a584ad28f2b8052ac97712585 type=CONTAINER_STARTED_EVENT
	Nov 24 09:00:29 functional-941011 containerd[3558]: time="2025-11-24T09:00:29.180810265Z" level=info msg="container event discarded" container=73cbc10088baee047f5af65dd9d3da14101d5c79193d1084620f4ef78567007c type=CONTAINER_STARTED_EVENT
	Nov 24 09:00:29 functional-941011 containerd[3558]: time="2025-11-24T09:00:29.180871402Z" level=info msg="container event discarded" container=b6a84160fb5a462413dc19bd1857be8f2613f26a584ad28f2b8052ac97712585 type=CONTAINER_STOPPED_EVENT
	Nov 24 09:00:29 functional-941011 containerd[3558]: time="2025-11-24T09:00:29.817411242Z" level=info msg="container event discarded" container=05393cf0d0a6a4762c03e908fda17fafda4ed38bac67cdf553df9130af52fa37 type=CONTAINER_DELETED_EVENT
	Nov 24 09:00:38 functional-941011 containerd[3558]: time="2025-11-24T09:00:38.580479275Z" level=info msg="container event discarded" container=784ae9c9b61292b5cfcfba35d59469a6a91365290aa46304545499084ede8d7e type=CONTAINER_STOPPED_EVENT
	Nov 24 09:00:38 functional-941011 containerd[3558]: time="2025-11-24T09:00:38.681844083Z" level=info msg="container event discarded" container=8a2c1ed065d22c98f59a6c1c4a6816d5635a1993d4b79170db5bdf930e6a5464 type=CONTAINER_STOPPED_EVENT
	Nov 24 09:00:38 functional-941011 containerd[3558]: time="2025-11-24T09:00:38.747114966Z" level=info msg="container event discarded" container=c67aa399f4b3099636ae5de0029e6f8778af29edd2cb80d538b3fce9f2752591 type=CONTAINER_STOPPED_EVENT
	
	
	==> coredns [c59aa8a0dc911d866d1284907405697c4c3891d4b4c4423e9f764fbf37efa9c9] <==
	maxprocs: Leaving GOMAXPROCS=2: CPU quota undefined
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 9e2996f8cb67ac53e0259ab1f8d615d07d1beb0bd07e6a1e39769c3bf486a905bb991cc47f8d2f14d0d3a90a87dfc625a0b4c524fed169d8158c40657c0694b1
	CoreDNS-1.12.1
	linux/arm64, go1.24.1, 707c7c1
	[INFO] 127.0.0.1:51507 - 53970 "HINFO IN 1655404113277552318.700530042625674888. udp 56 false 512" NXDOMAIN qr,rd,ra 56 0.01547365s
	[INFO] SIGTERM: Shutting down servers then terminating
	[INFO] plugin/health: Going into lameduck mode for 5s
	
	
	==> coredns [fc6fc133ebd24da2f7fbde321f7cbdf8f57ce1d20c8e70b56a4530409fe91cf6] <==
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.32.3/tools/cache/reflector.go:251: failed to list *v1.Service: Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: connect: connection refused
	[ERROR] plugin/kubernetes: Unhandled Error
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.32.3/tools/cache/reflector.go:251: failed to list *v1.Namespace: Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: connect: connection refused
	[ERROR] plugin/kubernetes: Unhandled Error
	[WARNING] plugin/kubernetes: starting server with unsynced Kubernetes API
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 9e2996f8cb67ac53e0259ab1f8d615d07d1beb0bd07e6a1e39769c3bf486a905bb991cc47f8d2f14d0d3a90a87dfc625a0b4c524fed169d8158c40657c0694b1
	CoreDNS-1.12.1
	linux/arm64, go1.24.1, 707c7c1
	[INFO] 127.0.0.1:44127 - 22213 "HINFO IN 7327540771892768101.5397131767721917287. udp 57 false 512" NXDOMAIN qr,rd,ra 57 0.043158014s
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.32.3/tools/cache/reflector.go:251: failed to list *v1.Service: Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: connect: connection refused
	[ERROR] plugin/kubernetes: Unhandled Error
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.32.3/tools/cache/reflector.go:251: failed to list *v1.Namespace: Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: connect: connection refused
	[ERROR] plugin/kubernetes: Unhandled Error
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.32.3/tools/cache/reflector.go:251: failed to list *v1.EndpointSlice: Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: connect: connection refused
	[ERROR] plugin/kubernetes: Unhandled Error
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.32.3/tools/cache/reflector.go:251: failed to list *v1.Service: services is forbidden: User "system:serviceaccount:kube-system:coredns" cannot list resource "services" in API group "" at the cluster scope
	[ERROR] plugin/kubernetes: Unhandled Error
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.32.3/tools/cache/reflector.go:251: failed to list *v1.Namespace: namespaces is forbidden: User "system:serviceaccount:kube-system:coredns" cannot list resource "namespaces" in API group "" at the cluster scope
	[ERROR] plugin/kubernetes: Unhandled Error
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	
	
	==> describe nodes <==
	Name:               functional-941011
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=arm64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=arm64
	                    kubernetes.io/hostname=functional-941011
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=393ee3e0b845623107dce6cda4f48ffd5c3d1811
	                    minikube.k8s.io/name=functional-941011
	                    minikube.k8s.io/primary=true
	                    minikube.k8s.io/updated_at=2025_11_24T08_54_12_0700
	                    minikube.k8s.io/version=v1.37.0
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Mon, 24 Nov 2025 08:54:08 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  functional-941011
	  AcquireTime:     <unset>
	  RenewTime:       Mon, 24 Nov 2025 09:00:40 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Mon, 24 Nov 2025 08:56:47 +0000   Mon, 24 Nov 2025 08:54:06 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Mon, 24 Nov 2025 08:56:47 +0000   Mon, 24 Nov 2025 08:54:06 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Mon, 24 Nov 2025 08:56:47 +0000   Mon, 24 Nov 2025 08:54:06 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Mon, 24 Nov 2025 08:56:47 +0000   Mon, 24 Nov 2025 08:54:58 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.168.49.2
	  Hostname:    functional-941011
	Capacity:
	  cpu:                2
	  ephemeral-storage:  203034800Ki
	  hugepages-1Gi:      0
	  hugepages-2Mi:      0
	  hugepages-32Mi:     0
	  hugepages-64Ki:     0
	  memory:             8022296Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  203034800Ki
	  hugepages-1Gi:      0
	  hugepages-2Mi:      0
	  hugepages-32Mi:     0
	  hugepages-64Ki:     0
	  memory:             8022296Ki
	  pods:               110
	System Info:
	  Machine ID:                 7283ea1857f18f20a875c29069214c9d
	  System UUID:                d38b29f6-9a27-498f-a371-693f2677a0b6
	  Boot ID:                    e6ca431c-3a35-478f-87f6-f49cc4bc8a65
	  Kernel Version:             5.15.0-1084-aws
	  OS Image:                   Debian GNU/Linux 12 (bookworm)
	  Operating System:           linux
	  Architecture:               arm64
	  Container Runtime Version:  containerd://2.1.5
	  Kubelet Version:            v1.34.2
	  Kube-Proxy Version:         
	PodCIDR:                      10.244.0.0/24
	PodCIDRs:                     10.244.0.0/24
	Non-terminated Pods:          (11 in total)
	  Namespace                   Name                                         CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                         ------------  ----------  ---------------  -------------  ---
	  default                     hello-node-75c85bcc94-srggf                  0 (0%)        0 (0%)      0 (0%)           0 (0%)         4m18s
	  default                     nginx-svc                                    0 (0%)        0 (0%)      0 (0%)           0 (0%)         4m13s
	  default                     sp-pod                                       0 (0%)        0 (0%)      0 (0%)           0 (0%)         4m1s
	  kube-system                 coredns-66bc5c9577-slkfz                     100m (5%)     0 (0%)      70Mi (0%)        170Mi (2%)     6m25s
	  kube-system                 etcd-functional-941011                       100m (5%)     0 (0%)      100Mi (1%)       0 (0%)         6m31s
	  kube-system                 kindnet-vsrrw                                100m (5%)     100m (5%)   50Mi (0%)        50Mi (0%)      6m26s
	  kube-system                 kube-apiserver-functional-941011             250m (12%)    0 (0%)      0 (0%)           0 (0%)         4m55s
	  kube-system                 kube-controller-manager-functional-941011    200m (10%)    0 (0%)      0 (0%)           0 (0%)         6m31s
	  kube-system                 kube-proxy-kmdsq                             0 (0%)        0 (0%)      0 (0%)           0 (0%)         6m26s
	  kube-system                 kube-scheduler-functional-941011             100m (5%)     0 (0%)      0 (0%)           0 (0%)         6m31s
	  kube-system                 storage-provisioner                          0 (0%)        0 (0%)      0 (0%)           0 (0%)         6m24s
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests    Limits
	  --------           --------    ------
	  cpu                850m (42%)  100m (5%)
	  memory             220Mi (2%)  220Mi (2%)
	  ephemeral-storage  0 (0%)      0 (0%)
	  hugepages-1Gi      0 (0%)      0 (0%)
	  hugepages-2Mi      0 (0%)      0 (0%)
	  hugepages-32Mi     0 (0%)      0 (0%)
	  hugepages-64Ki     0 (0%)      0 (0%)
	Events:
	  Type     Reason                   Age              From             Message
	  ----     ------                   ----             ----             -------
	  Normal   Starting                 6m25s            kube-proxy       
	  Normal   Starting                 4m56s            kube-proxy       
	  Normal   NodeAllocatableEnforced  6m31s            kubelet          Updated Node Allocatable limit across pods
	  Warning  CgroupV1                 6m31s            kubelet          cgroup v1 support is in maintenance mode, please migrate to cgroup v2
	  Normal   NodeHasSufficientMemory  6m31s            kubelet          Node functional-941011 status is now: NodeHasSufficientMemory
	  Normal   NodeHasNoDiskPressure    6m31s            kubelet          Node functional-941011 status is now: NodeHasNoDiskPressure
	  Normal   NodeHasSufficientPID     6m31s            kubelet          Node functional-941011 status is now: NodeHasSufficientPID
	  Normal   Starting                 6m31s            kubelet          Starting kubelet.
	  Normal   RegisteredNode           6m27s            node-controller  Node functional-941011 event: Registered Node functional-941011 in Controller
	  Normal   NodeReady                5m44s            kubelet          Node functional-941011 status is now: NodeReady
	  Normal   Starting                 5m               kubelet          Starting kubelet.
	  Warning  CgroupV1                 5m               kubelet          cgroup v1 support is in maintenance mode, please migrate to cgroup v2
	  Normal   NodeHasSufficientMemory  5m (x8 over 5m)  kubelet          Node functional-941011 status is now: NodeHasSufficientMemory
	  Normal   NodeHasNoDiskPressure    5m (x8 over 5m)  kubelet          Node functional-941011 status is now: NodeHasNoDiskPressure
	  Normal   NodeHasSufficientPID     5m (x7 over 5m)  kubelet          Node functional-941011 status is now: NodeHasSufficientPID
	  Normal   NodeAllocatableEnforced  5m               kubelet          Updated Node Allocatable limit across pods
	  Normal   RegisteredNode           4m53s            node-controller  Node functional-941011 event: Registered Node functional-941011 in Controller
	
	
	==> dmesg <==
	[Nov24 08:20] overlayfs: idmapped layers are currently not supported
	[Nov24 08:21] overlayfs: idmapped layers are currently not supported
	[Nov24 08:22] overlayfs: idmapped layers are currently not supported
	[Nov24 08:23] overlayfs: idmapped layers are currently not supported
	[Nov24 08:24] overlayfs: idmapped layers are currently not supported
	[ +19.566509] overlayfs: idmapped layers are currently not supported
	[Nov24 08:25] overlayfs: idmapped layers are currently not supported
	[ +35.934095] overlayfs: idmapped layers are currently not supported
	[Nov24 08:27] overlayfs: idmapped layers are currently not supported
	[Nov24 08:28] overlayfs: idmapped layers are currently not supported
	[Nov24 08:29] overlayfs: idmapped layers are currently not supported
	[Nov24 08:30] overlayfs: idmapped layers are currently not supported
	[Nov24 08:31] overlayfs: idmapped layers are currently not supported
	[ +44.505180] overlayfs: idmapped layers are currently not supported
	[Nov24 08:32] overlayfs: idmapped layers are currently not supported
	[Nov24 08:33] overlayfs: idmapped layers are currently not supported
	[Nov24 08:34] overlayfs: idmapped layers are currently not supported
	[ +21.137877] overlayfs: idmapped layers are currently not supported
	[ +28.724175] overlayfs: idmapped layers are currently not supported
	[Nov24 08:36] overlayfs: idmapped layers are currently not supported
	[Nov24 08:37] overlayfs: idmapped layers are currently not supported
	[Nov24 08:38] overlayfs: idmapped layers are currently not supported
	[  +4.063834] overlayfs: idmapped layers are currently not supported
	[Nov24 08:40] overlayfs: idmapped layers are currently not supported
	[Nov24 08:42] kauditd_printk_skb: 8 callbacks suppressed
	
	
	==> etcd [c67aa399f4b3099636ae5de0029e6f8778af29edd2cb80d538b3fce9f2752591] <==
	{"level":"warn","ts":"2025-11-24T08:54:08.005514Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:39950","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:54:08.022010Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:39954","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:54:08.048197Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:39980","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:54:08.061624Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:39986","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:54:08.083337Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:39998","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:54:08.099353Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:40006","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:54:08.164536Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:40022","server-name":"","error":"EOF"}
	{"level":"info","ts":"2025-11-24T08:55:38.679574Z","caller":"osutil/interrupt_unix.go:65","msg":"received signal; shutting down","signal":"terminated"}
	{"level":"info","ts":"2025-11-24T08:55:38.679640Z","caller":"embed/etcd.go:426","msg":"closing etcd server","name":"functional-941011","data-dir":"/var/lib/minikube/etcd","advertise-peer-urls":["https://192.168.49.2:2380"],"advertise-client-urls":["https://192.168.49.2:2379"]}
	{"level":"error","ts":"2025-11-24T08:55:38.679751Z","caller":"embed/etcd.go:912","msg":"setting up serving from embedded etcd failed.","error":"http: Server closed","stacktrace":"go.etcd.io/etcd/server/v3/embed.(*Etcd).errHandler\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:912\ngo.etcd.io/etcd/server/v3/embed.(*serveCtx).startHandler.func1\n\tgo.etcd.io/etcd/server/v3/embed/serve.go:90"}
	{"level":"error","ts":"2025-11-24T08:55:38.681678Z","caller":"embed/etcd.go:912","msg":"setting up serving from embedded etcd failed.","error":"http: Server closed","stacktrace":"go.etcd.io/etcd/server/v3/embed.(*Etcd).errHandler\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:912\ngo.etcd.io/etcd/server/v3/embed.(*serveCtx).startHandler.func1\n\tgo.etcd.io/etcd/server/v3/embed/serve.go:90"}
	{"level":"warn","ts":"2025-11-24T08:55:38.681817Z","caller":"embed/serve.go:245","msg":"stopping secure grpc server due to error","error":"accept tcp 127.0.0.1:2379: use of closed network connection"}
	{"level":"warn","ts":"2025-11-24T08:55:38.681861Z","caller":"embed/serve.go:247","msg":"stopped secure grpc server due to error","error":"accept tcp 127.0.0.1:2379: use of closed network connection"}
	{"level":"error","ts":"2025-11-24T08:55:38.681870Z","caller":"embed/etcd.go:912","msg":"setting up serving from embedded etcd failed.","error":"accept tcp 127.0.0.1:2379: use of closed network connection","stacktrace":"go.etcd.io/etcd/server/v3/embed.(*Etcd).errHandler\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:912\ngo.etcd.io/etcd/server/v3/embed.(*Etcd).startHandler.func1\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:906"}
	{"level":"warn","ts":"2025-11-24T08:55:38.681935Z","caller":"embed/serve.go:245","msg":"stopping secure grpc server due to error","error":"accept tcp 192.168.49.2:2379: use of closed network connection"}
	{"level":"warn","ts":"2025-11-24T08:55:38.681953Z","caller":"embed/serve.go:247","msg":"stopped secure grpc server due to error","error":"accept tcp 192.168.49.2:2379: use of closed network connection"}
	{"level":"error","ts":"2025-11-24T08:55:38.681992Z","caller":"embed/etcd.go:912","msg":"setting up serving from embedded etcd failed.","error":"accept tcp 127.0.0.1:2381: use of closed network connection","stacktrace":"go.etcd.io/etcd/server/v3/embed.(*Etcd).errHandler\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:912\ngo.etcd.io/etcd/server/v3/embed.(*Etcd).startHandler.func1\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:906"}
	{"level":"info","ts":"2025-11-24T08:55:38.682013Z","caller":"etcdserver/server.go:1297","msg":"skipped leadership transfer for single voting member cluster","local-member-id":"aec36adc501070cc","current-leader-member-id":"aec36adc501070cc"}
	{"level":"error","ts":"2025-11-24T08:55:38.681960Z","caller":"embed/etcd.go:912","msg":"setting up serving from embedded etcd failed.","error":"accept tcp 192.168.49.2:2379: use of closed network connection","stacktrace":"go.etcd.io/etcd/server/v3/embed.(*Etcd).errHandler\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:912\ngo.etcd.io/etcd/server/v3/embed.(*Etcd).startHandler.func1\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:906"}
	{"level":"info","ts":"2025-11-24T08:55:38.682068Z","caller":"etcdserver/server.go:2358","msg":"server has stopped; stopping storage version's monitor"}
	{"level":"info","ts":"2025-11-24T08:55:38.682078Z","caller":"etcdserver/server.go:2335","msg":"server has stopped; stopping cluster version's monitor"}
	{"level":"info","ts":"2025-11-24T08:55:38.685220Z","caller":"embed/etcd.go:621","msg":"stopping serving peer traffic","address":"192.168.49.2:2380"}
	{"level":"error","ts":"2025-11-24T08:55:38.685308Z","caller":"embed/etcd.go:912","msg":"setting up serving from embedded etcd failed.","error":"accept tcp 192.168.49.2:2380: use of closed network connection","stacktrace":"go.etcd.io/etcd/server/v3/embed.(*Etcd).errHandler\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:912\ngo.etcd.io/etcd/server/v3/embed.(*Etcd).startHandler.func1\n\tgo.etcd.io/etcd/server/v3/embed/etcd.go:906"}
	{"level":"info","ts":"2025-11-24T08:55:38.685331Z","caller":"embed/etcd.go:626","msg":"stopped serving peer traffic","address":"192.168.49.2:2380"}
	{"level":"info","ts":"2025-11-24T08:55:38.685338Z","caller":"embed/etcd.go:428","msg":"closed etcd server","name":"functional-941011","data-dir":"/var/lib/minikube/etcd","advertise-peer-urls":["https://192.168.49.2:2380"],"advertise-client-urls":["https://192.168.49.2:2379"]}
	
	
	==> etcd [f631f089f7db7469dd50cb6bbc0f39ec97f716fd16e1e26e288c4caf9028b204] <==
	{"level":"warn","ts":"2025-11-24T08:55:45.185740Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:40244","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:55:45.206944Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:40264","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:55:45.242806Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:40280","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:55:45.264118Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:40308","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:55:45.282780Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:40330","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:55:45.297953Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:40346","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:55:45.320828Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:40364","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:55:45.336722Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:40388","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:55:45.352974Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:40418","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:55:45.369743Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:40432","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:55:45.407571Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:40450","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:55:45.417153Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:40464","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:55:45.433463Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:40474","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:55:45.452186Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:40486","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:55:45.470636Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:40502","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:55:45.487399Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:40526","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:55:45.499700Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:40542","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:55:45.515500Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:40556","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:55:45.533300Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:40572","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:55:45.550559Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:40584","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:55:45.566082Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:40608","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:55:45.588289Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:40630","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:55:45.599854Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:40638","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:55:45.614758Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:40662","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-11-24T08:55:45.676226Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:40674","server-name":"","error":"EOF"}
	
	
	==> kernel <==
	 09:00:43 up  7:42,  0 user,  load average: 0.23, 0.74, 1.53
	Linux functional-941011 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kindnet [c6aa1ae3a9097decf5035a00445a3b904d34d773c5e3cbb839ca84c4b31b2ba7] <==
	I1124 08:58:39.321827       1 main.go:301] handling current node
	I1124 08:58:49.322308       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1124 08:58:49.322344       1 main.go:301] handling current node
	I1124 08:58:59.328406       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1124 08:58:59.328444       1 main.go:301] handling current node
	I1124 08:59:09.324985       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1124 08:59:09.325094       1 main.go:301] handling current node
	I1124 08:59:19.322402       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1124 08:59:19.322436       1 main.go:301] handling current node
	I1124 08:59:29.330508       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1124 08:59:29.330736       1 main.go:301] handling current node
	I1124 08:59:39.323582       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1124 08:59:39.323615       1 main.go:301] handling current node
	I1124 08:59:49.322221       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1124 08:59:49.322261       1 main.go:301] handling current node
	I1124 08:59:59.325123       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1124 08:59:59.325214       1 main.go:301] handling current node
	I1124 09:00:09.321618       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1124 09:00:09.321655       1 main.go:301] handling current node
	I1124 09:00:19.322342       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1124 09:00:19.322377       1 main.go:301] handling current node
	I1124 09:00:29.328310       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1124 09:00:29.328520       1 main.go:301] handling current node
	I1124 09:00:39.321914       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1124 09:00:39.321950       1 main.go:301] handling current node
	
	
	==> kindnet [ff28a2e89f65b3323fbc0f41262a0b445c865c0474fbdac4ec03f66b94a6826a] <==
	I1124 08:54:17.914898       1 main.go:139] hostIP = 192.168.49.2
	podIP = 192.168.49.2
	I1124 08:54:17.915083       1 main.go:148] setting mtu 1500 for CNI 
	I1124 08:54:17.915097       1 main.go:178] kindnetd IP family: "ipv4"
	I1124 08:54:17.915113       1 main.go:182] noMask IPv4 subnets: [10.244.0.0/16]
	time="2025-11-24T08:54:18Z" level=info msg="Created plugin 10-kube-network-policies (kindnetd, handles RunPodSandbox,RemovePodSandbox)"
	I1124 08:54:18.119698       1 controller.go:377] "Starting controller" name="kube-network-policies"
	I1124 08:54:18.119719       1 controller.go:381] "Waiting for informer caches to sync"
	I1124 08:54:18.119728       1 shared_informer.go:350] "Waiting for caches to sync" controller="kube-network-policies"
	I1124 08:54:18.119873       1 controller.go:390] nri plugin exited: failed to connect to NRI service: dial unix /var/run/nri/nri.sock: connect: no such file or directory
	E1124 08:54:48.120234       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.96.0.1:443/api/v1/nodes?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: i/o timeout" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Node"
	E1124 08:54:48.120236       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Namespace: Get \"https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: i/o timeout" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Namespace"
	E1124 08:54:48.120465       1 reflector.go:200] "Failed to watch" err="failed to list *v1.NetworkPolicy: Get \"https://10.96.0.1:443/apis/networking.k8s.io/v1/networkpolicies?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: i/o timeout" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.NetworkPolicy"
	E1124 08:54:48.120608       1 reflector.go:200] "Failed to watch" err="failed to list *v1.Pod: Get \"https://10.96.0.1:443/api/v1/pods?limit=500&resourceVersion=0\": dial tcp 10.96.0.1:443: i/o timeout" logger="UnhandledError" reflector="pkg/mod/k8s.io/client-go@v0.33.0/tools/cache/reflector.go:285" type="*v1.Pod"
	I1124 08:54:49.719897       1 shared_informer.go:357] "Caches are synced" controller="kube-network-policies"
	I1124 08:54:49.719946       1 metrics.go:72] Registering metrics
	I1124 08:54:49.720169       1 controller.go:711] "Syncing nftables rules"
	I1124 08:54:58.119604       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1124 08:54:58.119873       1 main.go:301] handling current node
	I1124 08:55:08.122765       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1124 08:55:08.123311       1 main.go:301] handling current node
	I1124 08:55:18.122591       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1124 08:55:18.122626       1 main.go:301] handling current node
	I1124 08:55:28.122561       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1124 08:55:28.122829       1 main.go:301] handling current node
	
	
	==> kube-apiserver [12cb72b6be32a86f06dc22816a54fbdc70bf5efe54d3eba2e90672e835fad88f] <==
	I1124 08:55:46.620661       1 handler_discovery.go:451] Starting ResourceDiscoveryManager
	I1124 08:55:46.624260       1 shared_informer.go:356] "Caches are synced" controller="crd-autoregister"
	I1124 08:55:46.624312       1 aggregator.go:171] initial CRD sync complete...
	I1124 08:55:46.624323       1 autoregister_controller.go:144] Starting autoregister controller
	I1124 08:55:46.624330       1 cache.go:32] Waiting for caches to sync for autoregister controller
	I1124 08:55:46.624336       1 cache.go:39] Caches are synced for autoregister controller
	I1124 08:55:46.624559       1 shared_informer.go:356] "Caches are synced" controller="kubernetes-service-cidr-controller"
	I1124 08:55:46.624611       1 default_servicecidr_controller.go:137] Shutting down kubernetes-service-cidr-controller
	I1124 08:55:46.624734       1 shared_informer.go:356] "Caches are synced" controller="configmaps"
	I1124 08:55:46.637331       1 shared_informer.go:356] "Caches are synced" controller="ipallocator-repair-controller"
	I1124 08:55:46.644590       1 controller.go:667] quota admission added evaluator for: leases.coordination.k8s.io
	I1124 08:55:47.218727       1 storage_scheduling.go:111] all system priority classes are created successfully or already exist.
	I1124 08:55:47.442763       1 controller.go:667] quota admission added evaluator for: serviceaccounts
	W1124 08:55:47.629531       1 lease.go:265] Resetting endpoints for master service "kubernetes" to [192.168.49.2]
	I1124 08:55:47.631201       1 controller.go:667] quota admission added evaluator for: endpoints
	I1124 08:55:47.640973       1 controller.go:667] quota admission added evaluator for: endpointslices.discovery.k8s.io
	I1124 08:55:48.340758       1 controller.go:667] quota admission added evaluator for: daemonsets.apps
	I1124 08:55:48.559757       1 controller.go:667] quota admission added evaluator for: deployments.apps
	I1124 08:55:48.672814       1 controller.go:667] quota admission added evaluator for: roles.rbac.authorization.k8s.io
	I1124 08:55:48.686003       1 controller.go:667] quota admission added evaluator for: rolebindings.rbac.authorization.k8s.io
	I1124 08:55:57.253802       1 controller.go:667] quota admission added evaluator for: replicasets.apps
	I1124 08:56:15.389377       1 alloc.go:328] "allocated clusterIPs" service="default/invalid-svc" clusterIPs={"IPv4":"10.100.46.187"}
	E1124 08:56:19.614283       1 watch.go:272] "Unhandled Error" err="http2: stream closed" logger="UnhandledError"
	I1124 08:56:24.710981       1 alloc.go:328] "allocated clusterIPs" service="default/hello-node" clusterIPs={"IPv4":"10.110.108.17"}
	I1124 08:56:29.396574       1 alloc.go:328] "allocated clusterIPs" service="default/nginx-svc" clusterIPs={"IPv4":"10.99.196.76"}
	
	
	==> kube-controller-manager [8a2c1ed065d22c98f59a6c1c4a6816d5635a1993d4b79170db5bdf930e6a5464] <==
	I1124 08:54:15.869534       1 shared_informer.go:356] "Caches are synced" controller="PV protection"
	I1124 08:54:15.869543       1 shared_informer.go:356] "Caches are synced" controller="ReplicaSet"
	I1124 08:54:15.869777       1 shared_informer.go:356] "Caches are synced" controller="namespace"
	I1124 08:54:15.878398       1 range_allocator.go:428] "Set node PodCIDR" logger="node-ipam-controller" node="functional-941011" podCIDRs=["10.244.0.0/24"]
	I1124 08:54:15.882870       1 shared_informer.go:356] "Caches are synced" controller="HPA"
	I1124 08:54:15.886073       1 shared_informer.go:356] "Caches are synced" controller="garbage collector"
	I1124 08:54:15.890370       1 shared_informer.go:356] "Caches are synced" controller="PVC protection"
	I1124 08:54:15.903201       1 shared_informer.go:356] "Caches are synced" controller="garbage collector"
	I1124 08:54:15.903228       1 garbagecollector.go:154] "Garbage collector: all resource monitors have synced" logger="garbage-collector-controller"
	I1124 08:54:15.903236       1 garbagecollector.go:157] "Proceeding to collect garbage" logger="garbage-collector-controller"
	I1124 08:54:15.908644       1 shared_informer.go:356] "Caches are synced" controller="disruption"
	I1124 08:54:15.909119       1 shared_informer.go:356] "Caches are synced" controller="validatingadmissionpolicy-status"
	I1124 08:54:15.910226       1 shared_informer.go:356] "Caches are synced" controller="endpoint_slice"
	I1124 08:54:15.910325       1 shared_informer.go:356] "Caches are synced" controller="ClusterRoleAggregator"
	I1124 08:54:15.910394       1 shared_informer.go:356] "Caches are synced" controller="attach detach"
	I1124 08:54:15.910704       1 shared_informer.go:356] "Caches are synced" controller="job"
	I1124 08:54:15.910793       1 shared_informer.go:356] "Caches are synced" controller="GC"
	I1124 08:54:15.910845       1 shared_informer.go:356] "Caches are synced" controller="legacy-service-account-token-cleaner"
	I1124 08:54:15.910971       1 shared_informer.go:356] "Caches are synced" controller="VAC protection"
	I1124 08:54:15.911017       1 shared_informer.go:356] "Caches are synced" controller="taint-eviction-controller"
	I1124 08:54:15.913830       1 shared_informer.go:356] "Caches are synced" controller="endpoint_slice_mirroring"
	I1124 08:54:15.916079       1 shared_informer.go:356] "Caches are synced" controller="persistent volume"
	I1124 08:54:15.918449       1 shared_informer.go:356] "Caches are synced" controller="resource quota"
	I1124 08:54:15.919891       1 shared_informer.go:356] "Caches are synced" controller="ephemeral"
	I1124 08:55:00.864980       1 node_lifecycle_controller.go:1044] "Controller detected that some Nodes are Ready. Exiting master disruption mode" logger="node-lifecycle-controller"
	
	
	==> kube-controller-manager [bb129591935391164c1c4d497b52259d48968c471089ff98292655891beffe48] <==
	I1124 08:55:50.025630       1 shared_informer.go:356] "Caches are synced" controller="resource quota"
	I1124 08:55:50.026699       1 shared_informer.go:356] "Caches are synced" controller="job"
	I1124 08:55:50.026699       1 shared_informer.go:356] "Caches are synced" controller="GC"
	I1124 08:55:50.029985       1 shared_informer.go:356] "Caches are synced" controller="node"
	I1124 08:55:50.030091       1 range_allocator.go:177] "Sending events to api server" logger="node-ipam-controller"
	I1124 08:55:50.030176       1 range_allocator.go:183] "Starting range CIDR allocator" logger="node-ipam-controller"
	I1124 08:55:50.030245       1 shared_informer.go:349] "Waiting for caches to sync" controller="cidrallocator"
	I1124 08:55:50.030328       1 shared_informer.go:356] "Caches are synced" controller="cidrallocator"
	I1124 08:55:50.031540       1 shared_informer.go:356] "Caches are synced" controller="resource quota"
	I1124 08:55:50.034699       1 shared_informer.go:356] "Caches are synced" controller="certificate-csrsigning-kubelet-client"
	I1124 08:55:50.034721       1 shared_informer.go:356] "Caches are synced" controller="certificate-csrsigning-kubelet-serving"
	I1124 08:55:50.035887       1 shared_informer.go:356] "Caches are synced" controller="disruption"
	I1124 08:55:50.039119       1 shared_informer.go:356] "Caches are synced" controller="certificate-csrsigning-kube-apiserver-client"
	I1124 08:55:50.040438       1 shared_informer.go:356] "Caches are synced" controller="certificate-csrsigning-legacy-unknown"
	I1124 08:55:50.043772       1 shared_informer.go:356] "Caches are synced" controller="TTL"
	I1124 08:55:50.049118       1 shared_informer.go:356] "Caches are synced" controller="VAC protection"
	I1124 08:55:50.051401       1 shared_informer.go:356] "Caches are synced" controller="garbage collector"
	I1124 08:55:50.054626       1 shared_informer.go:356] "Caches are synced" controller="ephemeral"
	I1124 08:55:50.060982       1 shared_informer.go:356] "Caches are synced" controller="endpoint_slice_mirroring"
	I1124 08:55:50.068706       1 shared_informer.go:356] "Caches are synced" controller="TTL after finished"
	I1124 08:55:50.075379       1 shared_informer.go:356] "Caches are synced" controller="resource_claim"
	I1124 08:55:50.077489       1 shared_informer.go:356] "Caches are synced" controller="endpoint"
	I1124 08:55:50.093310       1 shared_informer.go:356] "Caches are synced" controller="garbage collector"
	I1124 08:55:50.093553       1 garbagecollector.go:154] "Garbage collector: all resource monitors have synced" logger="garbage-collector-controller"
	I1124 08:55:50.093642       1 garbagecollector.go:157] "Proceeding to collect garbage" logger="garbage-collector-controller"
	
	
	==> kube-proxy [62155860688a446bc4f0edf281779e28f4ecf113640dbf9995722ce6bfe4cd55] <==
	I1124 08:54:17.565304       1 server_linux.go:53] "Using iptables proxy"
	I1124 08:54:17.660540       1 shared_informer.go:349] "Waiting for caches to sync" controller="node informer cache"
	I1124 08:54:17.760879       1 shared_informer.go:356] "Caches are synced" controller="node informer cache"
	I1124 08:54:17.760920       1 server.go:219] "Successfully retrieved NodeIPs" NodeIPs=["192.168.49.2"]
	E1124 08:54:17.761072       1 server.go:256] "Kube-proxy configuration may be incomplete or incorrect" err="nodePortAddresses is unset; NodePort connections will be accepted on all local IPs. Consider using `--nodeport-addresses primary`"
	I1124 08:54:17.852239       1 server.go:265] "kube-proxy running in dual-stack mode" primary ipFamily="IPv4"
	I1124 08:54:17.852307       1 server_linux.go:132] "Using iptables Proxier"
	I1124 08:54:17.866599       1 proxier.go:242] "Setting route_localnet=1 to allow node-ports on localhost; to change this either disable iptables.localhostNodePorts (--iptables-localhost-nodeports) or set nodePortAddresses (--nodeport-addresses) to filter loopback addresses" ipFamily="IPv4"
	I1124 08:54:17.866936       1 server.go:527] "Version info" version="v1.34.2"
	I1124 08:54:17.866952       1 server.go:529] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I1124 08:54:17.868618       1 config.go:200] "Starting service config controller"
	I1124 08:54:17.868629       1 shared_informer.go:349] "Waiting for caches to sync" controller="service config"
	I1124 08:54:17.868646       1 config.go:106] "Starting endpoint slice config controller"
	I1124 08:54:17.868649       1 shared_informer.go:349] "Waiting for caches to sync" controller="endpoint slice config"
	I1124 08:54:17.868660       1 config.go:403] "Starting serviceCIDR config controller"
	I1124 08:54:17.868666       1 shared_informer.go:349] "Waiting for caches to sync" controller="serviceCIDR config"
	I1124 08:54:17.876964       1 config.go:309] "Starting node config controller"
	I1124 08:54:17.876996       1 shared_informer.go:349] "Waiting for caches to sync" controller="node config"
	I1124 08:54:17.877013       1 shared_informer.go:356] "Caches are synced" controller="node config"
	I1124 08:54:17.969733       1 shared_informer.go:356] "Caches are synced" controller="service config"
	I1124 08:54:17.969806       1 shared_informer.go:356] "Caches are synced" controller="endpoint slice config"
	I1124 08:54:17.972675       1 shared_informer.go:356] "Caches are synced" controller="serviceCIDR config"
	
	
	==> kube-proxy [73cbc10088baee047f5af65dd9d3da14101d5c79193d1084620f4ef78567007c] <==
	I1124 08:55:29.304214       1 shared_informer.go:349] "Waiting for caches to sync" controller="node informer cache"
	E1124 08:55:29.305266       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://control-plane.minikube.internal:8441/api/v1/nodes?fieldSelector=metadata.name%3Dfunctional-941011&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	E1124 08:55:30.871934       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://control-plane.minikube.internal:8441/api/v1/nodes?fieldSelector=metadata.name%3Dfunctional-941011&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	E1124 08:55:32.698486       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://control-plane.minikube.internal:8441/api/v1/nodes?fieldSelector=metadata.name%3Dfunctional-941011&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	E1124 08:55:38.932652       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://control-plane.minikube.internal:8441/api/v1/nodes?fieldSelector=metadata.name%3Dfunctional-941011&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	I1124 08:55:46.604339       1 shared_informer.go:356] "Caches are synced" controller="node informer cache"
	I1124 08:55:46.604382       1 server.go:219] "Successfully retrieved NodeIPs" NodeIPs=["192.168.49.2"]
	E1124 08:55:46.604726       1 server.go:256] "Kube-proxy configuration may be incomplete or incorrect" err="nodePortAddresses is unset; NodePort connections will be accepted on all local IPs. Consider using `--nodeport-addresses primary`"
	I1124 08:55:46.633087       1 server.go:265] "kube-proxy running in dual-stack mode" primary ipFamily="IPv4"
	I1124 08:55:46.633170       1 server_linux.go:132] "Using iptables Proxier"
	I1124 08:55:46.637676       1 proxier.go:242] "Setting route_localnet=1 to allow node-ports on localhost; to change this either disable iptables.localhostNodePorts (--iptables-localhost-nodeports) or set nodePortAddresses (--nodeport-addresses) to filter loopback addresses" ipFamily="IPv4"
	I1124 08:55:46.638100       1 server.go:527] "Version info" version="v1.34.2"
	I1124 08:55:46.638127       1 server.go:529] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I1124 08:55:46.640035       1 config.go:200] "Starting service config controller"
	I1124 08:55:46.640207       1 shared_informer.go:349] "Waiting for caches to sync" controller="service config"
	I1124 08:55:46.640350       1 config.go:106] "Starting endpoint slice config controller"
	I1124 08:55:46.640418       1 shared_informer.go:349] "Waiting for caches to sync" controller="endpoint slice config"
	I1124 08:55:46.640509       1 config.go:403] "Starting serviceCIDR config controller"
	I1124 08:55:46.640794       1 shared_informer.go:349] "Waiting for caches to sync" controller="serviceCIDR config"
	I1124 08:55:46.647186       1 config.go:309] "Starting node config controller"
	I1124 08:55:46.647270       1 shared_informer.go:349] "Waiting for caches to sync" controller="node config"
	I1124 08:55:46.647299       1 shared_informer.go:356] "Caches are synced" controller="node config"
	I1124 08:55:46.741106       1 shared_informer.go:356] "Caches are synced" controller="serviceCIDR config"
	I1124 08:55:46.741112       1 shared_informer.go:356] "Caches are synced" controller="service config"
	I1124 08:55:46.741150       1 shared_informer.go:356] "Caches are synced" controller="endpoint slice config"
	
	
	==> kube-scheduler [39dbd124ad9131afd2e38d1cc6019bc4e02106f43653bf39b293c3e257811124] <==
	E1124 08:54:08.898071       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ResourceClaim: resourceclaims.resource.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"resourceclaims\" in API group \"resource.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ResourceClaim"
	E1124 08:54:08.898417       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:kube-scheduler\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
	E1124 08:54:08.898585       1 reflector.go:205] "Failed to watch" err="failed to list *v1.PersistentVolume: persistentvolumes is forbidden: User \"system:kube-scheduler\" cannot list resource \"persistentvolumes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.PersistentVolume"
	E1124 08:54:08.898858       1 reflector.go:205] "Failed to watch" err="failed to list *v1.StatefulSet: statefulsets.apps is forbidden: User \"system:kube-scheduler\" cannot list resource \"statefulsets\" in API group \"apps\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.StatefulSet"
	E1124 08:54:08.898950       1 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
	E1124 08:54:08.899058       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ReplicationController: replicationcontrollers is forbidden: User \"system:kube-scheduler\" cannot list resource \"replicationcontrollers\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ReplicationController"
	E1124 08:54:08.899111       1 reflector.go:205] "Failed to watch" err="failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"storageclasses\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.StorageClass"
	E1124 08:54:09.739885       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Namespace: namespaces is forbidden: User \"system:kube-scheduler\" cannot list resource \"namespaces\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Namespace"
	E1124 08:54:09.749915       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User \"system:kube-scheduler\" cannot list resource \"replicasets\" in API group \"apps\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ReplicaSet"
	E1124 08:54:09.907520       1 reflector.go:205] "Failed to watch" err="failed to list *v1.PersistentVolume: persistentvolumes is forbidden: User \"system:kube-scheduler\" cannot list resource \"persistentvolumes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.PersistentVolume"
	E1124 08:54:09.970254       1 reflector.go:205] "Failed to watch" err="failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User \"system:kube-scheduler\" cannot list resource \"persistentvolumeclaims\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.PersistentVolumeClaim"
	E1124 08:54:10.012374       1 reflector.go:205] "Failed to watch" err="failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"storageclasses\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.StorageClass"
	E1124 08:54:10.028349       1 reflector.go:205] "Failed to watch" err="failed to list *v1.StatefulSet: statefulsets.apps is forbidden: User \"system:kube-scheduler\" cannot list resource \"statefulsets\" in API group \"apps\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.StatefulSet"
	E1124 08:54:10.089620       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Pod: pods is forbidden: User \"system:kube-scheduler\" cannot list resource \"pods\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Pod"
	E1124 08:54:10.145424       1 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
	E1124 08:54:10.145452       1 reflector.go:205] "Failed to watch" err="failed to list *v1.DeviceClass: deviceclasses.resource.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"deviceclasses\" in API group \"resource.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.DeviceClass"
	E1124 08:54:10.166448       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ReplicationController: replicationcontrollers is forbidden: User \"system:kube-scheduler\" cannot list resource \"replicationcontrollers\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ReplicationController"
	E1124 08:54:10.168975       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:kube-scheduler\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
	E1124 08:54:10.472917       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"extension-apiserver-authentication\" is forbidden: User \"system:kube-scheduler\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"kube-system\"" logger="UnhandledError" reflector="runtime/asm_arm64.s:1223" type="*v1.ConfigMap"
	I1124 08:54:12.282734       1 shared_informer.go:356] "Caches are synced" controller="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
	I1124 08:55:28.427975       1 secure_serving.go:259] Stopped listening on 127.0.0.1:10259
	I1124 08:55:28.428004       1 server.go:263] "[graceful-termination] secure server has stopped listening"
	I1124 08:55:28.428018       1 configmap_cafile_content.go:226] "Shutting down controller" name="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
	I1124 08:55:28.428088       1 server.go:265] "[graceful-termination] secure server is exiting"
	E1124 08:55:28.428102       1 run.go:72] "command failed" err="finished without leader elect"
	
	
	==> kube-scheduler [4db14bc0f3747bb08ccdf3a4232220b42a95e683884234f9fff57a09e95b88c3] <==
	E1124 08:55:36.598347       1 reflector.go:205] "Failed to watch" err="failed to list *v1.VolumeAttachment: Get \"https://192.168.49.2:8441/apis/storage.k8s.io/v1/volumeattachments?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.VolumeAttachment"
	E1124 08:55:36.838482       1 reflector.go:205] "Failed to watch" err="failed to list *v1.PodDisruptionBudget: Get \"https://192.168.49.2:8441/apis/policy/v1/poddisruptionbudgets?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.PodDisruptionBudget"
	E1124 08:55:37.242275       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Namespace: Get \"https://192.168.49.2:8441/api/v1/namespaces?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Namespace"
	E1124 08:55:37.413389       1 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://192.168.49.2:8441/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
	E1124 08:55:37.515451       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: Get \"https://192.168.49.2:8441/api/v1/services?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
	E1124 08:55:37.711691       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Pod: Get \"https://192.168.49.2:8441/api/v1/pods?fieldSelector=status.phase%21%3DSucceeded%2Cstatus.phase%21%3DFailed&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Pod"
	E1124 08:55:38.082012       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ConfigMap: Get \"https://192.168.49.2:8441/api/v1/namespaces/kube-system/configmaps?fieldSelector=metadata.name%3Dextension-apiserver-authentication&limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="runtime/asm_arm64.s:1223" type="*v1.ConfigMap"
	E1124 08:55:38.154030       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ResourceClaim: Get \"https://192.168.49.2:8441/apis/resource.k8s.io/v1/resourceclaims?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ResourceClaim"
	E1124 08:55:38.280385       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ResourceSlice: Get \"https://192.168.49.2:8441/apis/resource.k8s.io/v1/resourceslices?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ResourceSlice"
	E1124 08:55:38.363163       1 reflector.go:205] "Failed to watch" err="failed to list *v1.CSINode: Get \"https://192.168.49.2:8441/apis/storage.k8s.io/v1/csinodes?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSINode"
	E1124 08:55:38.536440       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://192.168.49.2:8441/api/v1/nodes?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
	E1124 08:55:38.785077       1 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIStorageCapacity: Get \"https://192.168.49.2:8441/apis/storage.k8s.io/v1/csistoragecapacities?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIStorageCapacity"
	E1124 08:55:38.930056       1 reflector.go:205] "Failed to watch" err="failed to list *v1.DeviceClass: Get \"https://192.168.49.2:8441/apis/resource.k8s.io/v1/deviceclasses?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.DeviceClass"
	E1124 08:55:38.985957       1 reflector.go:205] "Failed to watch" err="failed to list *v1.PersistentVolumeClaim: Get \"https://192.168.49.2:8441/api/v1/persistentvolumeclaims?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.PersistentVolumeClaim"
	E1124 08:55:39.039078       1 reflector.go:205] "Failed to watch" err="failed to list *v1.StatefulSet: Get \"https://192.168.49.2:8441/apis/apps/v1/statefulsets?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.StatefulSet"
	E1124 08:55:39.348953       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ReplicaSet: Get \"https://192.168.49.2:8441/apis/apps/v1/replicasets?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ReplicaSet"
	E1124 08:55:39.714746       1 reflector.go:205] "Failed to watch" err="failed to list *v1.StorageClass: Get \"https://192.168.49.2:8441/apis/storage.k8s.io/v1/storageclasses?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.StorageClass"
	E1124 08:55:39.840410       1 reflector.go:205] "Failed to watch" err="failed to list *v1.PersistentVolume: Get \"https://192.168.49.2:8441/api/v1/persistentvolumes?limit=500&resourceVersion=0\": dial tcp 192.168.49.2:8441: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.PersistentVolume"
	E1124 08:55:46.556944       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ResourceClaim: resourceclaims.resource.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"resourceclaims\" in API group \"resource.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ResourceClaim"
	E1124 08:55:46.557177       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ReplicationController: replicationcontrollers is forbidden: User \"system:kube-scheduler\" cannot list resource \"replicationcontrollers\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ReplicationController"
	E1124 08:55:46.557352       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Pod: pods is forbidden: User \"system:kube-scheduler\" cannot list resource \"pods\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Pod"
	E1124 08:55:46.557551       1 reflector.go:205] "Failed to watch" err="failed to list *v1.PodDisruptionBudget: poddisruptionbudgets.policy is forbidden: User \"system:kube-scheduler\" cannot list resource \"poddisruptionbudgets\" in API group \"policy\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.PodDisruptionBudget"
	E1124 08:55:46.557775       1 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
	E1124 08:55:46.558034       1 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIStorageCapacity: csistoragecapacities.storage.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"csistoragecapacities\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIStorageCapacity"
	I1124 08:55:47.832870       1 shared_informer.go:356] "Caches are synced" controller="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
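The scheduler's "Failed to watch" errors above fall into two distinct phases: "connection refused" while the apiserver was unreachable during restart, then "is forbidden" while RBAC authorization had not yet caught up. A minimal, hypothetical triage sketch (the function name and buckets are our own, not part of minikube or Kubernetes) that separates the two when reading raw scheduler logs:

```python
from collections import Counter

def classify_reflector_errors(log_lines):
    """Bucket kube-scheduler reflector failures by apparent root cause.

    Hypothetical triage helper for raw scheduler logs like the block above:
    'connection refused' suggests the apiserver was unreachable (restart
    window), while 'is forbidden' suggests RBAC had not synced yet.
    """
    counts = Counter()
    for line in log_lines:
        if '"Failed to watch"' not in line:
            continue  # only count reflector watch failures
        if "connection refused" in line:
            counts["apiserver-unreachable"] += 1
        elif "is forbidden" in line:
            counts["rbac-forbidden"] += 1
        else:
            counts["other"] += 1
    return counts
```

In a report like this one, a run of "apiserver-unreachable" followed by a short tail of "rbac-forbidden" entries and then "Caches are synced" is consistent with a normal control-plane restart rather than a persistent failure.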
	
	
	==> kubelet <==
	Nov 24 08:59:24 functional-941011 kubelet[4469]: E1124 08:59:24.370554    4469 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"myfrontend\" with ImagePullBackOff: \"Back-off pulling image \\\"docker.io/nginx\\\": ErrImagePull: failed to pull and unpack image \\\"docker.io/library/nginx:latest\\\": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/library/nginx/manifests/sha256:553f64aecdc31b5bf944521731cd70e35da4faed96b2b7548a3d8e2598c52a42: 429 Too Many Requests\\ntoomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit\"" pod="default/sp-pod" podUID="8483abe7-6e51-4bf9-9b52-141abe46cd3e"
	Nov 24 08:59:35 functional-941011 kubelet[4469]: E1124 08:59:35.370407    4469 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nginx\" with ImagePullBackOff: \"Back-off pulling image \\\"docker.io/nginx:alpine\\\": ErrImagePull: failed to pull and unpack image \\\"docker.io/library/nginx:alpine\\\": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/library/nginx/manifests/sha256:7391b3732e7f7ccd23ff1d02fbeadcde496f374d7460ad8e79260f8f6d2c9f90: 429 Too Many Requests\\ntoomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit\"" pod="default/nginx-svc" podUID="d266cb95-cc24-4a91-a764-df9ddabaf208"
	Nov 24 08:59:37 functional-941011 kubelet[4469]: E1124 08:59:37.801650    4469 log.go:32] "PullImage from image service failed" err=<
	Nov 24 08:59:37 functional-941011 kubelet[4469]:         rpc error: code = Unknown desc = failed to pull and unpack image "docker.io/library/nginx:latest": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/library/nginx/manifests/sha256:553f64aecdc31b5bf944521731cd70e35da4faed96b2b7548a3d8e2598c52a42: 429 Too Many Requests
	Nov 24 08:59:37 functional-941011 kubelet[4469]:         toomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit
	Nov 24 08:59:37 functional-941011 kubelet[4469]:  > image="docker.io/nginx:latest"
	Nov 24 08:59:37 functional-941011 kubelet[4469]: E1124 08:59:37.801704    4469 kuberuntime_image.go:43] "Failed to pull image" err=<
	Nov 24 08:59:37 functional-941011 kubelet[4469]:         failed to pull and unpack image "docker.io/library/nginx:latest": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/library/nginx/manifests/sha256:553f64aecdc31b5bf944521731cd70e35da4faed96b2b7548a3d8e2598c52a42: 429 Too Many Requests
	Nov 24 08:59:37 functional-941011 kubelet[4469]:         toomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit
	Nov 24 08:59:37 functional-941011 kubelet[4469]:  > image="docker.io/nginx:latest"
	Nov 24 08:59:37 functional-941011 kubelet[4469]: E1124 08:59:37.801774    4469 kuberuntime_manager.go:1449] "Unhandled Error" err=<
	Nov 24 08:59:37 functional-941011 kubelet[4469]:         container myfrontend start failed in pod sp-pod_default(8483abe7-6e51-4bf9-9b52-141abe46cd3e): ErrImagePull: failed to pull and unpack image "docker.io/library/nginx:latest": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/library/nginx/manifests/sha256:553f64aecdc31b5bf944521731cd70e35da4faed96b2b7548a3d8e2598c52a42: 429 Too Many Requests
	Nov 24 08:59:37 functional-941011 kubelet[4469]:         toomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit
	Nov 24 08:59:37 functional-941011 kubelet[4469]:  > logger="UnhandledError"
	Nov 24 08:59:37 functional-941011 kubelet[4469]: E1124 08:59:37.801807    4469 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"myfrontend\" with ErrImagePull: \"failed to pull and unpack image \\\"docker.io/library/nginx:latest\\\": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/library/nginx/manifests/sha256:553f64aecdc31b5bf944521731cd70e35da4faed96b2b7548a3d8e2598c52a42: 429 Too Many Requests\\ntoomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit\"" pod="default/sp-pod" podUID="8483abe7-6e51-4bf9-9b52-141abe46cd3e"
	Nov 24 08:59:47 functional-941011 kubelet[4469]: E1124 08:59:47.371068    4469 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nginx\" with ImagePullBackOff: \"Back-off pulling image \\\"docker.io/nginx:alpine\\\": ErrImagePull: failed to pull and unpack image \\\"docker.io/library/nginx:alpine\\\": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/library/nginx/manifests/sha256:7391b3732e7f7ccd23ff1d02fbeadcde496f374d7460ad8e79260f8f6d2c9f90: 429 Too Many Requests\\ntoomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit\"" pod="default/nginx-svc" podUID="d266cb95-cc24-4a91-a764-df9ddabaf208"
	Nov 24 08:59:49 functional-941011 kubelet[4469]: E1124 08:59:49.370570    4469 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"myfrontend\" with ImagePullBackOff: \"Back-off pulling image \\\"docker.io/nginx\\\": ErrImagePull: failed to pull and unpack image \\\"docker.io/library/nginx:latest\\\": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/library/nginx/manifests/sha256:553f64aecdc31b5bf944521731cd70e35da4faed96b2b7548a3d8e2598c52a42: 429 Too Many Requests\\ntoomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit\"" pod="default/sp-pod" podUID="8483abe7-6e51-4bf9-9b52-141abe46cd3e"
	Nov 24 09:00:00 functional-941011 kubelet[4469]: E1124 09:00:00.373216    4469 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nginx\" with ImagePullBackOff: \"Back-off pulling image \\\"docker.io/nginx:alpine\\\": ErrImagePull: failed to pull and unpack image \\\"docker.io/library/nginx:alpine\\\": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/library/nginx/manifests/sha256:7391b3732e7f7ccd23ff1d02fbeadcde496f374d7460ad8e79260f8f6d2c9f90: 429 Too Many Requests\\ntoomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit\"" pod="default/nginx-svc" podUID="d266cb95-cc24-4a91-a764-df9ddabaf208"
	Nov 24 09:00:04 functional-941011 kubelet[4469]: E1124 09:00:04.374650    4469 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"myfrontend\" with ImagePullBackOff: \"Back-off pulling image \\\"docker.io/nginx\\\": ErrImagePull: failed to pull and unpack image \\\"docker.io/library/nginx:latest\\\": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/library/nginx/manifests/sha256:553f64aecdc31b5bf944521731cd70e35da4faed96b2b7548a3d8e2598c52a42: 429 Too Many Requests\\ntoomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit\"" pod="default/sp-pod" podUID="8483abe7-6e51-4bf9-9b52-141abe46cd3e"
	Nov 24 09:00:15 functional-941011 kubelet[4469]: E1124 09:00:15.370875    4469 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nginx\" with ImagePullBackOff: \"Back-off pulling image \\\"docker.io/nginx:alpine\\\": ErrImagePull: failed to pull and unpack image \\\"docker.io/library/nginx:alpine\\\": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/library/nginx/manifests/sha256:7391b3732e7f7ccd23ff1d02fbeadcde496f374d7460ad8e79260f8f6d2c9f90: 429 Too Many Requests\\ntoomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit\"" pod="default/nginx-svc" podUID="d266cb95-cc24-4a91-a764-df9ddabaf208"
	Nov 24 09:00:16 functional-941011 kubelet[4469]: E1124 09:00:16.370085    4469 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"myfrontend\" with ImagePullBackOff: \"Back-off pulling image \\\"docker.io/nginx\\\": ErrImagePull: failed to pull and unpack image \\\"docker.io/library/nginx:latest\\\": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/library/nginx/manifests/sha256:553f64aecdc31b5bf944521731cd70e35da4faed96b2b7548a3d8e2598c52a42: 429 Too Many Requests\\ntoomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit\"" pod="default/sp-pod" podUID="8483abe7-6e51-4bf9-9b52-141abe46cd3e"
	Nov 24 09:00:27 functional-941011 kubelet[4469]: E1124 09:00:27.370518    4469 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"myfrontend\" with ImagePullBackOff: \"Back-off pulling image \\\"docker.io/nginx\\\": ErrImagePull: failed to pull and unpack image \\\"docker.io/library/nginx:latest\\\": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/library/nginx/manifests/sha256:553f64aecdc31b5bf944521731cd70e35da4faed96b2b7548a3d8e2598c52a42: 429 Too Many Requests\\ntoomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit\"" pod="default/sp-pod" podUID="8483abe7-6e51-4bf9-9b52-141abe46cd3e"
	Nov 24 09:00:30 functional-941011 kubelet[4469]: E1124 09:00:30.370733    4469 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nginx\" with ImagePullBackOff: \"Back-off pulling image \\\"docker.io/nginx:alpine\\\": ErrImagePull: failed to pull and unpack image \\\"docker.io/library/nginx:alpine\\\": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/library/nginx/manifests/sha256:7391b3732e7f7ccd23ff1d02fbeadcde496f374d7460ad8e79260f8f6d2c9f90: 429 Too Many Requests\\ntoomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit\"" pod="default/nginx-svc" podUID="d266cb95-cc24-4a91-a764-df9ddabaf208"
	Nov 24 09:00:41 functional-941011 kubelet[4469]: E1124 09:00:41.369890    4469 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"myfrontend\" with ImagePullBackOff: \"Back-off pulling image \\\"docker.io/nginx\\\": ErrImagePull: failed to pull and unpack image \\\"docker.io/library/nginx:latest\\\": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/library/nginx/manifests/sha256:553f64aecdc31b5bf944521731cd70e35da4faed96b2b7548a3d8e2598c52a42: 429 Too Many Requests\\ntoomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit\"" pod="default/sp-pod" podUID="8483abe7-6e51-4bf9-9b52-141abe46cd3e"
	Nov 24 09:00:43 functional-941011 kubelet[4469]: E1124 09:00:43.371967    4469 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nginx\" with ImagePullBackOff: \"Back-off pulling image \\\"docker.io/nginx:alpine\\\": ErrImagePull: failed to pull and unpack image \\\"docker.io/library/nginx:alpine\\\": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/library/nginx/manifests/sha256:7391b3732e7f7ccd23ff1d02fbeadcde496f374d7460ad8e79260f8f6d2c9f90: 429 Too Many Requests\\ntoomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit\"" pod="default/nginx-svc" podUID="d266cb95-cc24-4a91-a764-df9ddabaf208"
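Every kubelet failure above traces back to the same cause: Docker Hub's unauthenticated pull rate limit (HTTP 429) blocking `docker.io/nginx` pulls for the `sp-pod` and `nginx-svc` test pods. A hypothetical sketch (our own helper, not part of the test harness) for extracting the set of affected pods from kubelet output, keying on the `429 Too Many Requests` marker and the `pod="<ns>/<name>"` field that kubelet emits:

```python
import re

def pods_hit_by_rate_limit(log_lines):
    """Collect pods whose image pulls failed with Docker Hub's 429 limit.

    Hypothetical helper for kubelet logs like the block above; it matches
    the '429 Too Many Requests' marker and extracts the pod="<ns>/<name>"
    field from the same line.
    """
    pods = set()
    for line in log_lines:
        if "429 Too Many Requests" not in line:
            continue
        m = re.search(r'pod="([^"]+)"', line)
        if m:
            pods.add(m.group(1))
    return sorted(pods)
```

A distinct set of only test-fixture pods (as here) points at CI-environment rate limiting rather than a product regression; pre-loading the images into the cluster or authenticating pulls would sidestep the limit.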
	
	
	==> storage-provisioner [b6a84160fb5a462413dc19bd1857be8f2613f26a584ad28f2b8052ac97712585] <==
	I1124 08:55:29.097755       1 storage_provisioner.go:116] Initializing the minikube storage provisioner...
	F1124 08:55:29.099748       1 main.go:39] error getting server version: Get "https://10.96.0.1:443/version?timeout=32s": dial tcp 10.96.0.1:443: connect: connection refused
	
	
	==> storage-provisioner [e068a2929b9a7acaf417417b25ba4835b5824322af5bcbddaa3f41f7d5cb8575] <==
	W1124 09:00:18.467922       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 09:00:20.471834       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 09:00:20.476186       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 09:00:22.478949       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 09:00:22.490073       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 09:00:24.493479       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 09:00:24.498590       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 09:00:26.501965       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 09:00:26.508783       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 09:00:28.512024       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 09:00:28.516838       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 09:00:30.520866       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 09:00:30.525535       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 09:00:32.529036       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 09:00:32.535765       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 09:00:34.538637       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 09:00:34.543830       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 09:00:36.546479       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 09:00:36.553186       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 09:00:38.555746       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 09:00:38.560175       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 09:00:40.564384       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 09:00:40.569111       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 09:00:42.572738       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1124 09:00:42.577370       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	

-- /stdout --
helpers_test.go:262: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-941011 -n functional-941011
helpers_test.go:269: (dbg) Run:  kubectl --context functional-941011 get po -o=jsonpath={.items[*].metadata.name} -A --field-selector=status.phase!=Running
helpers_test.go:280: non-running pods: nginx-svc sp-pod
helpers_test.go:282: ======> post-mortem[TestFunctional/parallel/PersistentVolumeClaim]: describe non-running pods <======
helpers_test.go:285: (dbg) Run:  kubectl --context functional-941011 describe pod nginx-svc sp-pod
helpers_test.go:290: (dbg) kubectl --context functional-941011 describe pod nginx-svc sp-pod:

-- stdout --
	Name:             nginx-svc
	Namespace:        default
	Priority:         0
	Service Account:  default
	Node:             functional-941011/192.168.49.2
	Start Time:       Mon, 24 Nov 2025 08:56:29 +0000
	Labels:           run=nginx-svc
	Annotations:      <none>
	Status:           Pending
	IP:               10.244.0.5
	IPs:
	  IP:  10.244.0.5
	Containers:
	  nginx:
	    Container ID:   
	    Image:          docker.io/nginx:alpine
	    Image ID:       
	    Port:           80/TCP
	    Host Port:      0/TCP
	    State:          Waiting
	      Reason:       ImagePullBackOff
	    Ready:          False
	    Restart Count:  0
	    Environment:    <none>
	    Mounts:
	      /var/run/secrets/kubernetes.io/serviceaccount from kube-api-access-qt757 (ro)
	Conditions:
	  Type                        Status
	  PodReadyToStartContainers   True 
	  Initialized                 True 
	  Ready                       False 
	  ContainersReady             False 
	  PodScheduled                True 
	Volumes:
	  kube-api-access-qt757:
	    Type:                    Projected (a volume that contains injected data from multiple sources)
	    TokenExpirationSeconds:  3607
	    ConfigMapName:           kube-root-ca.crt
	    Optional:                false
	    DownwardAPI:             true
	QoS Class:                   BestEffort
	Node-Selectors:              <none>
	Tolerations:                 node.kubernetes.io/not-ready:NoExecute op=Exists for 300s
	                             node.kubernetes.io/unreachable:NoExecute op=Exists for 300s
	Events:
	  Type     Reason     Age                    From               Message
	  ----     ------     ----                   ----               -------
	  Normal   Scheduled  4m15s                  default-scheduler  Successfully assigned default/nginx-svc to functional-941011
	  Warning  Failed     2m45s (x3 over 4m14s)  kubelet            Failed to pull image "docker.io/nginx:alpine": failed to pull and unpack image "docker.io/library/nginx:alpine": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/library/nginx/manifests/sha256:b3c656d55d7ad751196f21b7fd2e8d4da9cb430e32f646adcf92441b72f82b14: 429 Too Many Requests
	toomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit
	  Normal   Pulling  82s (x5 over 4m15s)  kubelet  Pulling image "docker.io/nginx:alpine"
	  Warning  Failed   82s (x5 over 4m14s)  kubelet  Error: ErrImagePull
	  Warning  Failed   82s (x2 over 3m36s)  kubelet  Failed to pull image "docker.io/nginx:alpine": failed to pull and unpack image "docker.io/library/nginx:alpine": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/library/nginx/manifests/sha256:7391b3732e7f7ccd23ff1d02fbeadcde496f374d7460ad8e79260f8f6d2c9f90: 429 Too Many Requests
	toomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit
	  Warning  Failed   14s (x15 over 4m14s)  kubelet  Error: ImagePullBackOff
	  Normal   BackOff  1s (x16 over 4m14s)   kubelet  Back-off pulling image "docker.io/nginx:alpine"
	
	
	Name:             sp-pod
	Namespace:        default
	Priority:         0
	Service Account:  default
	Node:             functional-941011/192.168.49.2
	Start Time:       Mon, 24 Nov 2025 08:56:41 +0000
	Labels:           test=storage-provisioner
	Annotations:      <none>
	Status:           Pending
	IP:               10.244.0.6
	IPs:
	  IP:  10.244.0.6
	Containers:
	  myfrontend:
	    Container ID:   
	    Image:          docker.io/nginx
	    Image ID:       
	    Port:           <none>
	    Host Port:      <none>
	    State:          Waiting
	      Reason:       ImagePullBackOff
	    Ready:          False
	    Restart Count:  0
	    Environment:    <none>
	    Mounts:
	      /tmp/mount from mypd (rw)
	      /var/run/secrets/kubernetes.io/serviceaccount from kube-api-access-26mp7 (ro)
	Conditions:
	  Type                        Status
	  PodReadyToStartContainers   True 
	  Initialized                 True 
	  Ready                       False 
	  ContainersReady             False 
	  PodScheduled                True 
	Volumes:
	  mypd:
	    Type:       PersistentVolumeClaim (a reference to a PersistentVolumeClaim in the same namespace)
	    ClaimName:  myclaim
	    ReadOnly:   false
	  kube-api-access-26mp7:
	    Type:                    Projected (a volume that contains injected data from multiple sources)
	    TokenExpirationSeconds:  3607
	    ConfigMapName:           kube-root-ca.crt
	    Optional:                false
	    DownwardAPI:             true
	QoS Class:                   BestEffort
	Node-Selectors:              <none>
	Tolerations:                 node.kubernetes.io/not-ready:NoExecute op=Exists for 300s
	                             node.kubernetes.io/unreachable:NoExecute op=Exists for 300s
	Events:
	  Type     Reason     Age                 From               Message
	  ----     ------     ----                ----               -------
	  Normal   Scheduled  4m3s                default-scheduler  Successfully assigned default/sp-pod to functional-941011
	  Normal   Pulling    67s (x5 over 4m3s)  kubelet            Pulling image "docker.io/nginx"
	  Warning  Failed     67s (x5 over 4m2s)  kubelet            Failed to pull image "docker.io/nginx": failed to pull and unpack image "docker.io/library/nginx:latest": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/library/nginx/manifests/sha256:553f64aecdc31b5bf944521731cd70e35da4faed96b2b7548a3d8e2598c52a42: 429 Too Many Requests
	toomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit
	  Warning  Failed   67s (x5 over 4m2s)  kubelet  Error: ErrImagePull
	  Normal   BackOff  3s (x15 over 4m2s)  kubelet  Back-off pulling image "docker.io/nginx"
	  Warning  Failed   3s (x15 over 4m2s)  kubelet  Error: ImagePullBackOff

-- /stdout --
helpers_test.go:293: <<< TestFunctional/parallel/PersistentVolumeClaim FAILED: end of post-mortem logs <<<
helpers_test.go:294: ---------------------/post-mortem---------------------------------
--- FAIL: TestFunctional/parallel/PersistentVolumeClaim (249.67s)

TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup (240.83s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup
functional_test_tunnel_test.go:212: (dbg) Run:  kubectl --context functional-941011 apply -f testdata/testsvc.yaml
functional_test_tunnel_test.go:216: (dbg) TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup: waiting 4m0s for pods matching "run=nginx-svc" in namespace "default" ...
helpers_test.go:352: "nginx-svc" [d266cb95-cc24-4a91-a764-df9ddabaf208] Pending
helpers_test.go:352: "nginx-svc" [d266cb95-cc24-4a91-a764-df9ddabaf208] Pending / Ready:ContainersNotReady (containers with unready status: [nginx]) / ContainersReady:ContainersNotReady (containers with unready status: [nginx])
functional_test_tunnel_test.go:216: ***** TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup: pod "run=nginx-svc" failed to start within 4m0s: context deadline exceeded ****
functional_test_tunnel_test.go:216: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-941011 -n functional-941011
functional_test_tunnel_test.go:216: TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup: showing logs for failed pods as of 2025-11-24 09:00:29.740478325 +0000 UTC m=+1045.107784863
functional_test_tunnel_test.go:216: (dbg) Run:  kubectl --context functional-941011 describe po nginx-svc -n default
functional_test_tunnel_test.go:216: (dbg) kubectl --context functional-941011 describe po nginx-svc -n default:
Name:             nginx-svc
Namespace:        default
Priority:         0
Service Account:  default
Node:             functional-941011/192.168.49.2
Start Time:       Mon, 24 Nov 2025 08:56:29 +0000
Labels:           run=nginx-svc
Annotations:      <none>
Status:           Pending
IP:               10.244.0.5
IPs:
IP:  10.244.0.5
Containers:
nginx:
Container ID:   
Image:          docker.io/nginx:alpine
Image ID:       
Port:           80/TCP
Host Port:      0/TCP
State:          Waiting
Reason:       ImagePullBackOff
Ready:          False
Restart Count:  0
Environment:    <none>
Mounts:
/var/run/secrets/kubernetes.io/serviceaccount from kube-api-access-qt757 (ro)
Conditions:
Type                        Status
PodReadyToStartContainers   True 
Initialized                 True 
Ready                       False 
ContainersReady             False 
PodScheduled                True 
Volumes:
kube-api-access-qt757:
Type:                    Projected (a volume that contains injected data from multiple sources)
TokenExpirationSeconds:  3607
ConfigMapName:           kube-root-ca.crt
Optional:                false
DownwardAPI:             true
QoS Class:                   BestEffort
Node-Selectors:              <none>
Tolerations:                 node.kubernetes.io/not-ready:NoExecute op=Exists for 300s
node.kubernetes.io/unreachable:NoExecute op=Exists for 300s
Events:
Type     Reason     Age                    From               Message
----     ------     ----                   ----               -------
Normal   Scheduled  4m                     default-scheduler  Successfully assigned default/nginx-svc to functional-941011
Warning  Failed     2m30s (x3 over 3m59s)  kubelet            Failed to pull image "docker.io/nginx:alpine": failed to pull and unpack image "docker.io/library/nginx:alpine": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/library/nginx/manifests/sha256:b3c656d55d7ad751196f21b7fd2e8d4da9cb430e32f646adcf92441b72f82b14: 429 Too Many Requests
toomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit
Normal   Pulling  67s (x5 over 4m)     kubelet  Pulling image "docker.io/nginx:alpine"
Warning  Failed   67s (x5 over 3m59s)  kubelet  Error: ErrImagePull
Warning  Failed   67s (x2 over 3m21s)  kubelet  Failed to pull image "docker.io/nginx:alpine": failed to pull and unpack image "docker.io/library/nginx:alpine": failed to copy: httpReadSeeker: failed open: unexpected status from GET request to https://registry-1.docker.io/v2/library/nginx/manifests/sha256:7391b3732e7f7ccd23ff1d02fbeadcde496f374d7460ad8e79260f8f6d2c9f90: 429 Too Many Requests
toomanyrequests: You have reached your unauthenticated pull rate limit. https://www.docker.com/increase-rate-limit
Normal   BackOff  14s (x14 over 3m59s)  kubelet  Back-off pulling image "docker.io/nginx:alpine"
Warning  Failed   14s (x14 over 3m59s)  kubelet  Error: ImagePullBackOff
functional_test_tunnel_test.go:216: (dbg) Run:  kubectl --context functional-941011 logs nginx-svc -n default
functional_test_tunnel_test.go:216: (dbg) Non-zero exit: kubectl --context functional-941011 logs nginx-svc -n default: exit status 1 (101.360346ms)

** stderr ** 
	Error from server (BadRequest): container "nginx" in pod "nginx-svc" is waiting to start: trying and failing to pull image

** /stderr **
functional_test_tunnel_test.go:216: kubectl --context functional-941011 logs nginx-svc -n default: exit status 1
functional_test_tunnel_test.go:217: wait: run=nginx-svc within 4m0s: context deadline exceeded
--- FAIL: TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup (240.83s)

TestFunctional/parallel/TunnelCmd/serial/AccessDirect (87.3s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/AccessDirect
I1124 09:00:29.936118 1654467 retry.go:31] will retry after 3.980250141s: Temporary Error: Get "http:": http: no Host in request URL
I1124 09:00:33.917281 1654467 retry.go:31] will retry after 2.567211298s: Temporary Error: Get "http:": http: no Host in request URL
I1124 09:00:36.484675 1654467 retry.go:31] will retry after 9.839725448s: Temporary Error: Get "http:": http: no Host in request URL
functional_test_tunnel_test.go:288: failed to hit nginx at "http://": Temporary Error: Get "http:": http: no Host in request URL
functional_test_tunnel_test.go:290: (dbg) Run:  kubectl --context functional-941011 get svc nginx-svc
functional_test_tunnel_test.go:294: failed to kubectl get svc nginx-svc:
NAME        TYPE           CLUSTER-IP     EXTERNAL-IP    PORT(S)        AGE
nginx-svc   LoadBalancer   10.99.196.76   10.99.196.76   80:30914/TCP   5m28s
functional_test_tunnel_test.go:301: expected body to contain "Welcome to nginx!", but got *""*
--- FAIL: TestFunctional/parallel/TunnelCmd/serial/AccessDirect (87.30s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/StartWithProxy (510.27s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/StartWithProxy
functional_test.go:2239: (dbg) Run:  out/minikube-linux-arm64 start -p functional-291288 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-beta.0
E1124 09:11:03.604805 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/addons-674149/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1124 09:11:24.717448 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-941011/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1124 09:11:24.723930 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-941011/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1124 09:11:24.735656 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-941011/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1124 09:11:24.757117 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-941011/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1124 09:11:24.798610 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-941011/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1124 09:11:24.880194 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-941011/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1124 09:11:25.041882 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-941011/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1124 09:11:25.363701 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-941011/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1124 09:11:26.005040 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-941011/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1124 09:11:27.287124 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-941011/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1124 09:11:29.850046 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-941011/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1124 09:11:34.971652 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-941011/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1124 09:11:45.213898 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-941011/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1124 09:12:05.695773 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-941011/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1124 09:12:26.672916 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/addons-674149/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1124 09:12:46.657244 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-941011/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1124 09:14:08.579379 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-941011/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1124 09:16:03.608526 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/addons-674149/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1124 09:16:24.717293 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-941011/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1124 09:16:52.420811 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-941011/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
functional_test.go:2239: (dbg) Non-zero exit: out/minikube-linux-arm64 start -p functional-291288 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-beta.0: exit status 109 (8m28.794603378s)

-- stdout --
	* [functional-291288] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	  - MINIKUBE_LOCATION=21978
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/21978-1652607/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/21978-1652607/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-arm64
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the docker driver based on user configuration
	* Using Docker driver with root privileges
	* Starting "functional-291288" primary control-plane node in "functional-291288" cluster
	* Pulling base image v0.0.48-1763789673-21948 ...
	* Found network options:
	  - HTTP_PROXY=localhost:38719
	* Please see https://minikube.sigs.k8s.io/docs/handbook/vpn_and_proxy/ for more details
	* Preparing Kubernetes v1.35.0-beta.0 on containerd 2.1.5 ...
	
	
-- /stdout --
-- /stdout --
** stderr ** 
	! Local proxy ignored: not passing HTTP_PROXY=localhost:38719 to docker env.
	! You appear to be using a proxy, but your NO_PROXY environment does not include the minikube IP (192.168.49.2).
	! initialization failed, will try again: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Generating "apiserver-kubelet-client" certificate and key
	[certs] Generating "front-proxy-ca" certificate and key
	[certs] Generating "front-proxy-client" certificate and key
	[certs] Generating "etcd/ca" certificate and key
	[certs] Generating "etcd/server" certificate and key
	[certs] etcd/server serving cert is signed for DNS names [functional-291288 localhost] and IPs [192.168.49.2 127.0.0.1 ::1]
	[certs] Generating "etcd/peer" certificate and key
	[certs] etcd/peer serving cert is signed for DNS names [functional-291288 localhost] and IPs [192.168.49.2 127.0.0.1 ::1]
	[certs] Generating "etcd/healthcheck-client" certificate and key
	[certs] Generating "apiserver-etcd-client" certificate and key
	[certs] Generating "sa" key and public key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001254358s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	* 
	X Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000282192s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	* 
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	X Exiting due to K8S_KUBELET_NOT_RUNNING: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000282192s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	* Suggestion: Check output of 'journalctl -xeu kubelet', try passing --extra-config=kubelet.cgroup-driver=systemd to minikube start
	* Related issue: https://github.com/kubernetes/minikube/issues/4172

** /stderr **
functional_test.go:2241: failed minikube start. args "out/minikube-linux-arm64 start -p functional-291288 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-beta.0": exit status 109
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:223: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/StartWithProxy]: network settings <======
helpers_test.go:230: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:238: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/StartWithProxy]: docker inspect <======
helpers_test.go:239: (dbg) Run:  docker inspect functional-291288
helpers_test.go:243: (dbg) docker inspect functional-291288:

-- stdout --
	[
	    {
	        "Id": "70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52",
	        "Created": "2025-11-24T09:10:51.896020191Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 1695240,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-11-24T09:10:51.968983407Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:572c983e466f1f784136812eef5cc59ac623db764bc7704d3676c4643993fd08",
	        "ResolvConfPath": "/var/lib/docker/containers/70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52/hostname",
	        "HostsPath": "/var/lib/docker/containers/70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52/hosts",
	        "LogPath": "/var/lib/docker/containers/70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52/70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52-json.log",
	        "Name": "/functional-291288",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "functional-291288:/var",
	                "/lib/modules:/lib/modules:ro"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-291288",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52",
	                "LowerDir": "/var/lib/docker/overlay2/3cc6f5f8c809d502c515552ad283ef0dee330beb830a15376b0447c77fbc81b7-init/diff:/var/lib/docker/overlay2/bf294786c7ade59cb761c7bdcc27045362c3779b00a569b685443f34ade17737/diff",
	                "MergedDir": "/var/lib/docker/overlay2/3cc6f5f8c809d502c515552ad283ef0dee330beb830a15376b0447c77fbc81b7/merged",
	                "UpperDir": "/var/lib/docker/overlay2/3cc6f5f8c809d502c515552ad283ef0dee330beb830a15376b0447c77fbc81b7/diff",
	                "WorkDir": "/var/lib/docker/overlay2/3cc6f5f8c809d502c515552ad283ef0dee330beb830a15376b0447c77fbc81b7/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "volume",
	                "Name": "functional-291288",
	                "Source": "/var/lib/docker/volumes/functional-291288/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            },
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-291288",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-291288",
	                "name.minikube.sigs.k8s.io": "functional-291288",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "09c1c2eef0dca6362dde63b4cbc372c0cfa3e4fd084b8745043d8b88925691bf",
	            "SandboxKey": "/var/run/docker/netns/09c1c2eef0dc",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34684"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34685"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34688"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34686"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34687"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-291288": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "7e:49:22:0b:f9:2c",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "1e8f91e8ad9f46b831bbb1b0589b0022d940ee9875e64a648dc80612f3ca93dc",
	                    "EndpointID": "5de5ca8ccb07584b21e6e4e30dba12e0233e8d28c3e48e705cddffe75263b337",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-291288",
	                        "70848be15fcc"
	                    ]
	                }
	            }
	        }
	    }
	]

-- /stdout --
helpers_test.go:247: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-291288 -n functional-291288
helpers_test.go:247: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-291288 -n functional-291288: exit status 6 (312.365845ms)

-- stdout --
	Running
	WARNING: Your kubectl is pointing to stale minikube-vm.
	To fix the kubectl context, run `minikube update-context`

-- /stdout --
** stderr ** 
	E1124 09:19:19.716070 1701009 status.go:458] kubeconfig endpoint: get endpoint: "functional-291288" does not appear in /home/jenkins/minikube-integration/21978-1652607/kubeconfig

** /stderr **
helpers_test.go:247: status error: exit status 6 (may be ok)
helpers_test.go:252: <<< TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/StartWithProxy FAILED: start of post-mortem logs <<<
helpers_test.go:253: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/StartWithProxy]: minikube logs <======
helpers_test.go:255: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 logs -n 25
helpers_test.go:260: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/StartWithProxy logs: 
-- stdout --
	
	==> Audit <==
	┌────────────────┬─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│    COMMAND     │                                                                          ARGS                                                                           │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├────────────────┼─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ ssh            │ functional-941011 ssh -- ls -la /mount-9p                                                                                                               │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:02 UTC │ 24 Nov 25 09:02 UTC │
	│ ssh            │ functional-941011 ssh sudo umount -f /mount-9p                                                                                                          │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:02 UTC │                     │
	│ mount          │ -p functional-941011 /tmp/TestFunctionalparallelMountCmdVerifyCleanup1964745822/001:/mount1 --alsologtostderr -v=1                                      │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:02 UTC │                     │
	│ ssh            │ functional-941011 ssh findmnt -T /mount1                                                                                                                │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:02 UTC │ 24 Nov 25 09:02 UTC │
	│ mount          │ -p functional-941011 /tmp/TestFunctionalparallelMountCmdVerifyCleanup1964745822/001:/mount2 --alsologtostderr -v=1                                      │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:02 UTC │                     │
	│ mount          │ -p functional-941011 /tmp/TestFunctionalparallelMountCmdVerifyCleanup1964745822/001:/mount3 --alsologtostderr -v=1                                      │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:02 UTC │                     │
	│ ssh            │ functional-941011 ssh findmnt -T /mount2                                                                                                                │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:02 UTC │ 24 Nov 25 09:02 UTC │
	│ ssh            │ functional-941011 ssh findmnt -T /mount3                                                                                                                │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:02 UTC │ 24 Nov 25 09:02 UTC │
	│ mount          │ -p functional-941011 --kill=true                                                                                                                        │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:02 UTC │                     │
	│ start          │ -p functional-941011 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=containerd                                         │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:02 UTC │                     │
	│ start          │ -p functional-941011 --dry-run --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd                                                   │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:02 UTC │                     │
	│ start          │ -p functional-941011 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=containerd                                         │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:02 UTC │                     │
	│ dashboard      │ --url --port 36195 -p functional-941011 --alsologtostderr -v=1                                                                                          │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:02 UTC │                     │
	│ update-context │ functional-941011 update-context --alsologtostderr -v=2                                                                                                 │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:07 UTC │ 24 Nov 25 09:07 UTC │
	│ update-context │ functional-941011 update-context --alsologtostderr -v=2                                                                                                 │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:07 UTC │ 24 Nov 25 09:07 UTC │
	│ update-context │ functional-941011 update-context --alsologtostderr -v=2                                                                                                 │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:07 UTC │ 24 Nov 25 09:07 UTC │
	│ image          │ functional-941011 image ls --format short --alsologtostderr                                                                                             │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:07 UTC │ 24 Nov 25 09:07 UTC │
	│ image          │ functional-941011 image ls --format yaml --alsologtostderr                                                                                              │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:07 UTC │ 24 Nov 25 09:07 UTC │
	│ ssh            │ functional-941011 ssh pgrep buildkitd                                                                                                                   │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:07 UTC │                     │
	│ image          │ functional-941011 image build -t localhost/my-image:functional-941011 testdata/build --alsologtostderr                                                  │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:07 UTC │ 24 Nov 25 09:07 UTC │
	│ image          │ functional-941011 image ls                                                                                                                              │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:07 UTC │ 24 Nov 25 09:07 UTC │
	│ image          │ functional-941011 image ls --format json --alsologtostderr                                                                                              │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:07 UTC │ 24 Nov 25 09:07 UTC │
	│ image          │ functional-941011 image ls --format table --alsologtostderr                                                                                             │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:07 UTC │ 24 Nov 25 09:07 UTC │
	│ delete         │ -p functional-941011                                                                                                                                    │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:10 UTC │ 24 Nov 25 09:10 UTC │
	│ start          │ -p functional-291288 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-beta.0 │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:10 UTC │                     │
	└────────────────┴─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/11/24 09:10:50
	Running on machine: ip-172-31-30-239
	Binary: Built with gc go1.25.3 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1124 09:10:50.641436 1694914 out.go:360] Setting OutFile to fd 1 ...
	I1124 09:10:50.641549 1694914 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1124 09:10:50.641553 1694914 out.go:374] Setting ErrFile to fd 2...
	I1124 09:10:50.641556 1694914 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1124 09:10:50.641813 1694914 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21978-1652607/.minikube/bin
	I1124 09:10:50.642205 1694914 out.go:368] Setting JSON to false
	I1124 09:10:50.643101 1694914 start.go:133] hostinfo: {"hostname":"ip-172-31-30-239","uptime":28380,"bootTime":1763947071,"procs":151,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"92f46a7d-c249-4c12-924a-77f64874c910"}
	I1124 09:10:50.643162 1694914 start.go:143] virtualization:  
	I1124 09:10:50.647685 1694914 out.go:179] * [functional-291288] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1124 09:10:50.652425 1694914 out.go:179]   - MINIKUBE_LOCATION=21978
	I1124 09:10:50.652532 1694914 notify.go:221] Checking for updates...
	I1124 09:10:50.658963 1694914 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1124 09:10:50.662111 1694914 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/21978-1652607/kubeconfig
	I1124 09:10:50.665126 1694914 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/21978-1652607/.minikube
	I1124 09:10:50.668197 1694914 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1124 09:10:50.671337 1694914 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1124 09:10:50.674730 1694914 driver.go:422] Setting default libvirt URI to qemu:///system
	I1124 09:10:50.700065 1694914 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1124 09:10:50.700178 1694914 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1124 09:10:50.772192 1694914 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:0 ContainersRunning:0 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:24 OomKillDisable:true NGoroutines:42 SystemTime:2025-11-24 09:10:50.762715906 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214831104 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1124 09:10:50.772288 1694914 docker.go:319] overlay module found
	I1124 09:10:50.777484 1694914 out.go:179] * Using the docker driver based on user configuration
	I1124 09:10:50.780487 1694914 start.go:309] selected driver: docker
	I1124 09:10:50.780499 1694914 start.go:927] validating driver "docker" against <nil>
	I1124 09:10:50.780512 1694914 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1124 09:10:50.781282 1694914 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1124 09:10:50.841185 1694914 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:0 ContainersRunning:0 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:24 OomKillDisable:true NGoroutines:42 SystemTime:2025-11-24 09:10:50.831743317 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214831104 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1124 09:10:50.841334 1694914 start_flags.go:327] no existing cluster config was found, will generate one from the flags 
	I1124 09:10:50.841555 1694914 start_flags.go:992] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I1124 09:10:50.844479 1694914 out.go:179] * Using Docker driver with root privileges
	I1124 09:10:50.847498 1694914 cni.go:84] Creating CNI manager for ""
	I1124 09:10:50.847561 1694914 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1124 09:10:50.847569 1694914 start_flags.go:336] Found "CNI" CNI - setting NetworkPlugin=cni
	I1124 09:10:50.847643 1694914 start.go:353] cluster config:
	{Name:functional-291288 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-291288 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1124 09:10:50.852652 1694914 out.go:179] * Starting "functional-291288" primary control-plane node in "functional-291288" cluster
	I1124 09:10:50.855533 1694914 cache.go:134] Beginning downloading kic base image for docker with containerd
	I1124 09:10:50.858541 1694914 out.go:179] * Pulling base image v0.0.48-1763789673-21948 ...
	I1124 09:10:50.861525 1694914 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
	I1124 09:10:50.861616 1694914 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f in local docker daemon
	I1124 09:10:50.880984 1694914 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f in local docker daemon, skipping pull
	I1124 09:10:50.880996 1694914 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f exists in daemon, skipping load
	W1124 09:10:50.918561 1694914 preload.go:144] https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.35.0-beta.0/preloaded-images-k8s-v18-v1.35.0-beta.0-containerd-overlay2-arm64.tar.lz4 status code: 404
	W1124 09:10:51.063759 1694914 preload.go:144] https://github.com/kubernetes-sigs/minikube-preloads/releases/download/v18/preloaded-images-k8s-v18-v1.35.0-beta.0-containerd-overlay2-arm64.tar.lz4 status code: 404
	I1124 09:10:51.064032 1694914 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:10:51.064141 1694914 profile.go:143] Saving config to /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/config.json ...
	I1124 09:10:51.064167 1694914 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/config.json: {Name:mk47a22af26b5b5312bacd53340b3fea1079ae82 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1124 09:10:51.064336 1694914 cache.go:243] Successfully downloaded all kic artifacts
	I1124 09:10:51.064360 1694914 start.go:360] acquireMachinesLock for functional-291288: {Name:mk85384dc057570e1f34db593d357cea738652c4 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:10:51.064400 1694914 start.go:364] duration metric: took 31.303µs to acquireMachinesLock for "functional-291288"
	I1124 09:10:51.064417 1694914 start.go:93] Provisioning new machine with config: &{Name:functional-291288 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-291288 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name: IP: Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I1124 09:10:51.064470 1694914 start.go:125] createHost starting for "" (driver="docker")
	I1124 09:10:51.069847 1694914 out.go:252] * Creating docker container (CPUs=2, Memory=4096MB) ...
	W1124 09:10:51.070147 1694914 out.go:285] ! Local proxy ignored: not passing HTTP_PROXY=localhost:38719 to docker env.
	I1124 09:10:51.070170 1694914 start.go:159] libmachine.API.Create for "functional-291288" (driver="docker")
	I1124 09:10:51.070207 1694914 client.go:173] LocalClient.Create starting
	I1124 09:10:51.070306 1694914 main.go:143] libmachine: Reading certificate data from /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem
	I1124 09:10:51.070343 1694914 main.go:143] libmachine: Decoding PEM data...
	I1124 09:10:51.070358 1694914 main.go:143] libmachine: Parsing certificate...
	I1124 09:10:51.070415 1694914 main.go:143] libmachine: Reading certificate data from /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/cert.pem
	I1124 09:10:51.070430 1694914 main.go:143] libmachine: Decoding PEM data...
	I1124 09:10:51.070440 1694914 main.go:143] libmachine: Parsing certificate...
	I1124 09:10:51.070845 1694914 cli_runner.go:164] Run: docker network inspect functional-291288 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	W1124 09:10:51.096950 1694914 cli_runner.go:211] docker network inspect functional-291288 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}" returned with exit code 1
	I1124 09:10:51.097031 1694914 network_create.go:284] running [docker network inspect functional-291288] to gather additional debugging logs...
	I1124 09:10:51.097047 1694914 cli_runner.go:164] Run: docker network inspect functional-291288
	W1124 09:10:51.118526 1694914 cli_runner.go:211] docker network inspect functional-291288 returned with exit code 1
	I1124 09:10:51.118547 1694914 network_create.go:287] error running [docker network inspect functional-291288]: docker network inspect functional-291288: exit status 1
	stdout:
	[]
	
	stderr:
	Error response from daemon: network functional-291288 not found
	I1124 09:10:51.118560 1694914 network_create.go:289] output of [docker network inspect functional-291288]: -- stdout --
	[]
	
	-- /stdout --
	** stderr ** 
	Error response from daemon: network functional-291288 not found
	
	** /stderr **
	I1124 09:10:51.118660 1694914 cli_runner.go:164] Run: docker network inspect bridge --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1124 09:10:51.153020 1694914 network.go:206] using free private subnet 192.168.49.0/24: &{IP:192.168.49.0 Netmask:255.255.255.0 Prefix:24 CIDR:192.168.49.0/24 Gateway:192.168.49.1 ClientMin:192.168.49.2 ClientMax:192.168.49.254 Broadcast:192.168.49.255 IsPrivate:true Interface:{IfaceName: IfaceIPv4: IfaceMTU:0 IfaceMAC:} reservation:0x4001a86120}
	I1124 09:10:51.153053 1694914 network_create.go:124] attempt to create docker network functional-291288 192.168.49.0/24 with gateway 192.168.49.1 and MTU of 1500 ...
	I1124 09:10:51.153114 1694914 cli_runner.go:164] Run: docker network create --driver=bridge --subnet=192.168.49.0/24 --gateway=192.168.49.1 -o --ip-masq -o --icc -o com.docker.network.driver.mtu=1500 --label=created_by.minikube.sigs.k8s.io=true --label=name.minikube.sigs.k8s.io=functional-291288 functional-291288
	I1124 09:10:51.222640 1694914 network_create.go:108] docker network functional-291288 192.168.49.0/24 created
	I1124 09:10:51.222662 1694914 kic.go:121] calculated static IP "192.168.49.2" for the "functional-291288" container
	I1124 09:10:51.222749 1694914 cli_runner.go:164] Run: docker ps -a --format {{.Names}}
	I1124 09:10:51.239024 1694914 cli_runner.go:164] Run: docker volume create functional-291288 --label name.minikube.sigs.k8s.io=functional-291288 --label created_by.minikube.sigs.k8s.io=true
	I1124 09:10:51.247585 1694914 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:10:51.266509 1694914 oci.go:103] Successfully created a docker volume functional-291288
	I1124 09:10:51.266596 1694914 cli_runner.go:164] Run: docker run --rm --name functional-291288-preload-sidecar --label created_by.minikube.sigs.k8s.io=true --label name.minikube.sigs.k8s.io=functional-291288 --entrypoint /usr/bin/test -v functional-291288:/var gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f -d /var/lib
	I1124 09:10:51.431760 1694914 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:10:51.604828 1694914 cache.go:107] acquiring lock: {Name:mk22a10f0ce1f3295b61e7e76c455d0494a3e278 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:10:51.604921 1694914 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 exists
	I1124 09:10:51.604929 1694914 cache.go:96] cache image "gcr.io/k8s-minikube/storage-provisioner:v5" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5" took 119.903µs
	I1124 09:10:51.604942 1694914 cache.go:80] save to tar file gcr.io/k8s-minikube/storage-provisioner:v5 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 succeeded
	I1124 09:10:51.604955 1694914 cache.go:107] acquiring lock: {Name:mk1cf42e67442503a46c578224bd3cb68bf682d4 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:10:51.604984 1694914 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 exists
	I1124 09:10:51.604988 1694914 cache.go:96] cache image "registry.k8s.io/kube-apiserver:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0" took 35.036µs
	I1124 09:10:51.604993 1694914 cache.go:80] save to tar file registry.k8s.io/kube-apiserver:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 succeeded
	I1124 09:10:51.605001 1694914 cache.go:107] acquiring lock: {Name:mkfdc49c8e68aee34cee0c9d441ae8a4dca675c6 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:10:51.605028 1694914 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 exists
	I1124 09:10:51.605032 1694914 cache.go:96] cache image "registry.k8s.io/kube-controller-manager:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0" took 32.41µs
	I1124 09:10:51.605039 1694914 cache.go:80] save to tar file registry.k8s.io/kube-controller-manager:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 succeeded
	I1124 09:10:51.605048 1694914 cache.go:107] acquiring lock: {Name:mkdbf38e05e2c47c1a7a906a2236e9e7020a94c5 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:10:51.605074 1694914 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 exists
	I1124 09:10:51.605079 1694914 cache.go:96] cache image "registry.k8s.io/kube-scheduler:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0" took 31.385µs
	I1124 09:10:51.605083 1694914 cache.go:80] save to tar file registry.k8s.io/kube-scheduler:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 succeeded
	I1124 09:10:51.605091 1694914 cache.go:107] acquiring lock: {Name:mk80fdbe7cdb5bc17c2a82b4ecfd00214559a435 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:10:51.605114 1694914 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 exists
	I1124 09:10:51.605117 1694914 cache.go:96] cache image "registry.k8s.io/kube-proxy:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0" took 27.537µs
	I1124 09:10:51.605122 1694914 cache.go:80] save to tar file registry.k8s.io/kube-proxy:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 succeeded
	I1124 09:10:51.605129 1694914 cache.go:107] acquiring lock: {Name:mk85f1502dbb97830776608fb729eb3605e112e6 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:10:51.605155 1694914 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 exists
	I1124 09:10:51.605158 1694914 cache.go:96] cache image "registry.k8s.io/pause:3.10.1" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1" took 30.384µs
	I1124 09:10:51.605162 1694914 cache.go:80] save to tar file registry.k8s.io/pause:3.10.1 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 succeeded
	I1124 09:10:51.605171 1694914 cache.go:107] acquiring lock: {Name:mk46ce3b59d7e062b3dbc8a90fe5b4231f256471 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:10:51.605205 1694914 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.5.24-0 exists
	I1124 09:10:51.605208 1694914 cache.go:96] cache image "registry.k8s.io/etcd:3.5.24-0" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.5.24-0" took 39.549µs
	I1124 09:10:51.605217 1694914 cache.go:80] save to tar file registry.k8s.io/etcd:3.5.24-0 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.5.24-0 succeeded
	I1124 09:10:51.605224 1694914 cache.go:107] acquiring lock: {Name:mk726502cb84c177b2e14fee88512325761511c5 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:10:51.605252 1694914 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 exists
	I1124 09:10:51.605263 1694914 cache.go:96] cache image "registry.k8s.io/coredns/coredns:v1.13.1" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1" took 33.075µs
	I1124 09:10:51.605267 1694914 cache.go:80] save to tar file registry.k8s.io/coredns/coredns:v1.13.1 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 succeeded
	I1124 09:10:51.605273 1694914 cache.go:87] Successfully saved all images to host disk.
	I1124 09:10:51.830239 1694914 oci.go:107] Successfully prepared a docker volume functional-291288
	I1124 09:10:51.830298 1694914 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
	W1124 09:10:51.830436 1694914 cgroups_linux.go:77] Your kernel does not support swap limit capabilities or the cgroup is not mounted.
	I1124 09:10:51.830579 1694914 cli_runner.go:164] Run: docker info --format "'{{json .SecurityOptions}}'"
	I1124 09:10:51.880837 1694914 cli_runner.go:164] Run: docker run -d -t --privileged --security-opt seccomp=unconfined --tmpfs /tmp --tmpfs /run -v /lib/modules:/lib/modules:ro --hostname functional-291288 --name functional-291288 --label created_by.minikube.sigs.k8s.io=true --label name.minikube.sigs.k8s.io=functional-291288 --label role.minikube.sigs.k8s.io= --label mode.minikube.sigs.k8s.io=functional-291288 --network functional-291288 --ip 192.168.49.2 --volume functional-291288:/var --security-opt apparmor=unconfined --memory=4096mb --cpus=2 -e container=docker --expose 8441 --publish=127.0.0.1::8441 --publish=127.0.0.1::22 --publish=127.0.0.1::2376 --publish=127.0.0.1::5000 --publish=127.0.0.1::32443 gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f
	I1124 09:10:52.207560 1694914 cli_runner.go:164] Run: docker container inspect functional-291288 --format={{.State.Running}}
	I1124 09:10:52.233197 1694914 cli_runner.go:164] Run: docker container inspect functional-291288 --format={{.State.Status}}
	I1124 09:10:52.253429 1694914 cli_runner.go:164] Run: docker exec functional-291288 stat /var/lib/dpkg/alternatives/iptables
	I1124 09:10:52.305864 1694914 oci.go:144] the created container "functional-291288" has a running status.
	I1124 09:10:52.305898 1694914 kic.go:225] Creating ssh key for kic: /home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-291288/id_rsa...
	I1124 09:10:52.932585 1694914 kic_runner.go:191] docker (temp): /home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-291288/id_rsa.pub --> /home/docker/.ssh/authorized_keys (381 bytes)
	I1124 09:10:52.951431 1694914 cli_runner.go:164] Run: docker container inspect functional-291288 --format={{.State.Status}}
	I1124 09:10:52.967474 1694914 kic_runner.go:93] Run: chown docker:docker /home/docker/.ssh/authorized_keys
	I1124 09:10:52.967485 1694914 kic_runner.go:114] Args: [docker exec --privileged functional-291288 chown docker:docker /home/docker/.ssh/authorized_keys]
	I1124 09:10:53.015680 1694914 cli_runner.go:164] Run: docker container inspect functional-291288 --format={{.State.Status}}
	I1124 09:10:53.033231 1694914 machine.go:94] provisionDockerMachine start ...
	I1124 09:10:53.033316 1694914 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:10:53.050444 1694914 main.go:143] libmachine: Using SSH client type: native
	I1124 09:10:53.050813 1694914 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 34684 <nil> <nil>}
	I1124 09:10:53.050820 1694914 main.go:143] libmachine: About to run SSH command:
	hostname
	I1124 09:10:53.051495 1694914 main.go:143] libmachine: Error dialing TCP: ssh: handshake failed: EOF
	I1124 09:10:56.206008 1694914 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-291288
	
	I1124 09:10:56.206022 1694914 ubuntu.go:182] provisioning hostname "functional-291288"
	I1124 09:10:56.206094 1694914 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:10:56.222862 1694914 main.go:143] libmachine: Using SSH client type: native
	I1124 09:10:56.223187 1694914 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 34684 <nil> <nil>}
	I1124 09:10:56.223197 1694914 main.go:143] libmachine: About to run SSH command:
	sudo hostname functional-291288 && echo "functional-291288" | sudo tee /etc/hostname
	I1124 09:10:56.383592 1694914 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-291288
	
	I1124 09:10:56.383663 1694914 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:10:56.400572 1694914 main.go:143] libmachine: Using SSH client type: native
	I1124 09:10:56.400871 1694914 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 34684 <nil> <nil>}
	I1124 09:10:56.400885 1694914 main.go:143] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sfunctional-291288' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 functional-291288/g' /etc/hosts;
				else 
					echo '127.0.1.1 functional-291288' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1124 09:10:56.550743 1694914 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	I1124 09:10:56.550761 1694914 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/21978-1652607/.minikube CaCertPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/21978-1652607/.minikube}
	I1124 09:10:56.550790 1694914 ubuntu.go:190] setting up certificates
	I1124 09:10:56.550798 1694914 provision.go:84] configureAuth start
	I1124 09:10:56.550859 1694914 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-291288
	I1124 09:10:56.570037 1694914 provision.go:143] copyHostCerts
	I1124 09:10:56.570099 1694914 exec_runner.go:144] found /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.pem, removing ...
	I1124 09:10:56.570107 1694914 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.pem
	I1124 09:10:56.570184 1694914 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.pem (1078 bytes)
	I1124 09:10:56.570282 1694914 exec_runner.go:144] found /home/jenkins/minikube-integration/21978-1652607/.minikube/cert.pem, removing ...
	I1124 09:10:56.570287 1694914 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21978-1652607/.minikube/cert.pem
	I1124 09:10:56.570312 1694914 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/21978-1652607/.minikube/cert.pem (1123 bytes)
	I1124 09:10:56.570362 1694914 exec_runner.go:144] found /home/jenkins/minikube-integration/21978-1652607/.minikube/key.pem, removing ...
	I1124 09:10:56.570365 1694914 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21978-1652607/.minikube/key.pem
	I1124 09:10:56.570392 1694914 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/21978-1652607/.minikube/key.pem (1679 bytes)
	I1124 09:10:56.570690 1694914 provision.go:117] generating server cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca-key.pem org=jenkins.functional-291288 san=[127.0.0.1 192.168.49.2 functional-291288 localhost minikube]
	I1124 09:10:56.723012 1694914 provision.go:177] copyRemoteCerts
	I1124 09:10:56.723068 1694914 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1124 09:10:56.723109 1694914 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:10:56.744943 1694914 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34684 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-291288/id_rsa Username:docker}
	I1124 09:10:56.850228 1694914 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I1124 09:10:56.867377 1694914 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server.pem --> /etc/docker/server.pem (1220 bytes)
	I1124 09:10:56.884330 1694914 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I1124 09:10:56.901748 1694914 provision.go:87] duration metric: took 350.92548ms to configureAuth
	I1124 09:10:56.901765 1694914 ubuntu.go:206] setting minikube options for container-runtime
	I1124 09:10:56.901958 1694914 config.go:182] Loaded profile config "functional-291288": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1124 09:10:56.901963 1694914 machine.go:97] duration metric: took 3.868722077s to provisionDockerMachine
	I1124 09:10:56.901969 1694914 client.go:176] duration metric: took 5.831756782s to LocalClient.Create
	I1124 09:10:56.901992 1694914 start.go:167] duration metric: took 5.831822301s to libmachine.API.Create "functional-291288"
	I1124 09:10:56.901999 1694914 start.go:293] postStartSetup for "functional-291288" (driver="docker")
	I1124 09:10:56.902008 1694914 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1124 09:10:56.902057 1694914 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1124 09:10:56.902107 1694914 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:10:56.919574 1694914 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34684 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-291288/id_rsa Username:docker}
	I1124 09:10:57.023268 1694914 ssh_runner.go:195] Run: cat /etc/os-release
	I1124 09:10:57.026950 1694914 main.go:143] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1124 09:10:57.026970 1694914 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1124 09:10:57.026980 1694914 filesync.go:126] Scanning /home/jenkins/minikube-integration/21978-1652607/.minikube/addons for local assets ...
	I1124 09:10:57.027038 1694914 filesync.go:126] Scanning /home/jenkins/minikube-integration/21978-1652607/.minikube/files for local assets ...
	I1124 09:10:57.027131 1694914 filesync.go:149] local asset: /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/ssl/certs/16544672.pem -> 16544672.pem in /etc/ssl/certs
	I1124 09:10:57.027213 1694914 filesync.go:149] local asset: /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/test/nested/copy/1654467/hosts -> hosts in /etc/test/nested/copy/1654467
	I1124 09:10:57.027262 1694914 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs /etc/test/nested/copy/1654467
	I1124 09:10:57.035125 1694914 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/ssl/certs/16544672.pem --> /etc/ssl/certs/16544672.pem (1708 bytes)
	I1124 09:10:57.053210 1694914 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/test/nested/copy/1654467/hosts --> /etc/test/nested/copy/1654467/hosts (40 bytes)
	I1124 09:10:57.070594 1694914 start.go:296] duration metric: took 168.58196ms for postStartSetup
	I1124 09:10:57.070953 1694914 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-291288
	I1124 09:10:57.087414 1694914 profile.go:143] Saving config to /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/config.json ...
	I1124 09:10:57.087673 1694914 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1124 09:10:57.087709 1694914 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:10:57.104072 1694914 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34684 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-291288/id_rsa Username:docker}
	I1124 09:10:57.207270 1694914 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1124 09:10:57.211817 1694914 start.go:128] duration metric: took 6.147333606s to createHost
	I1124 09:10:57.211833 1694914 start.go:83] releasing machines lock for "functional-291288", held for 6.147426317s
	I1124 09:10:57.211916 1694914 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-291288
	I1124 09:10:57.232459 1694914 out.go:179] * Found network options:
	I1124 09:10:57.235387 1694914 out.go:179]   - HTTP_PROXY=localhost:38719
	W1124 09:10:57.238315 1694914 out.go:285] ! You appear to be using a proxy, but your NO_PROXY environment does not include the minikube IP (192.168.49.2).
	I1124 09:10:57.240971 1694914 out.go:179] * Please see https://minikube.sigs.k8s.io/docs/handbook/vpn_and_proxy/ for more details
	I1124 09:10:57.243821 1694914 ssh_runner.go:195] Run: cat /version.json
	I1124 09:10:57.243864 1694914 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:10:57.243884 1694914 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1124 09:10:57.243949 1694914 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:10:57.264743 1694914 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34684 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-291288/id_rsa Username:docker}
	I1124 09:10:57.286580 1694914 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34684 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-291288/id_rsa Username:docker}
	I1124 09:10:57.370092 1694914 ssh_runner.go:195] Run: systemctl --version
	I1124 09:10:57.462312 1694914 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W1124 09:10:57.467396 1694914 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1124 09:10:57.467472 1694914 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1124 09:10:57.494825 1694914 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist, /etc/cni/net.d/10-crio-bridge.conflist.disabled] bridge cni config(s)
	I1124 09:10:57.494838 1694914 start.go:496] detecting cgroup driver to use...
	I1124 09:10:57.494870 1694914 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1124 09:10:57.494922 1694914 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I1124 09:10:57.510380 1694914 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I1124 09:10:57.523801 1694914 docker.go:218] disabling cri-docker service (if available) ...
	I1124 09:10:57.523863 1694914 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1124 09:10:57.541965 1694914 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1124 09:10:57.560484 1694914 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1124 09:10:57.677316 1694914 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1124 09:10:57.800183 1694914 docker.go:234] disabling docker service ...
	I1124 09:10:57.800253 1694914 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1124 09:10:57.822222 1694914 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1124 09:10:57.835586 1694914 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1124 09:10:57.963235 1694914 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1124 09:10:58.102869 1694914 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1124 09:10:58.118398 1694914 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
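For reference, the crictl configuration written by the command above can be replayed against a scratch directory instead of `/etc`, so it runs without root. The directory path here is illustrative; only the file name and the `runtime-endpoint` line come from the log.

```shell
# Re-create the crictl.yaml that minikube writes above, in a temp dir
# (hypothetical location; the real target is /etc/crictl.yaml).
dir=$(mktemp -d)
printf '%s\n' 'runtime-endpoint: unix:///run/containerd/containerd.sock' > "$dir/crictl.yaml"
cat "$dir/crictl.yaml"
```

This one-line file is what points the `crictl` CLI at containerd's CRI socket rather than at dockershim or CRI-O.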
	I1124 09:10:58.134190 1694914 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:10:58.277375 1694914 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10.1"|' /etc/containerd/config.toml"
	I1124 09:10:58.286870 1694914 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I1124 09:10:58.295651 1694914 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I1124 09:10:58.295714 1694914 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I1124 09:10:58.304512 1694914 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1124 09:10:58.313004 1694914 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I1124 09:10:58.321700 1694914 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1124 09:10:58.330256 1694914 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1124 09:10:58.338442 1694914 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I1124 09:10:58.348298 1694914 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I1124 09:10:58.357485 1694914 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I1124 09:10:58.367192 1694914 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1124 09:10:58.375166 1694914 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1124 09:10:58.382618 1694914 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1124 09:10:58.501160 1694914 ssh_runner.go:195] Run: sudo systemctl restart containerd
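The sed-based containerd reconfiguration in the preceding commands can be sketched on a scratch copy of a config file; this shows only the `SystemdCgroup = false` edit (matching the "cgroupfs" driver detected earlier). The sample config fragment is illustrative, not taken from the test host.

```shell
# Hedged sketch: replay minikube's SystemdCgroup edit on a temp file
# so it runs without root. The fragment below is a made-up minimal config.
cfg=$(mktemp)
cat > "$cfg" <<'EOF'
[plugins."io.containerd.grpc.v1.cri".containerd.runtimes.runc.options]
  SystemdCgroup = true
EOF
# Same sed invocation as in the log, pointed at the scratch file.
sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' "$cfg"
grep 'SystemdCgroup' "$cfg"
```

The `\1` back-reference preserves whatever indentation the key had, which is why the pattern captures leading spaces instead of anchoring on a fixed column.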
	I1124 09:10:58.595085 1694914 start.go:543] Will wait 60s for socket path /run/containerd/containerd.sock
	I1124 09:10:58.595149 1694914 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I1124 09:10:58.599517 1694914 start.go:564] Will wait 60s for crictl version
	I1124 09:10:58.599571 1694914 ssh_runner.go:195] Run: which crictl
	I1124 09:10:58.603476 1694914 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1124 09:10:58.629489 1694914 start.go:580] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v2.1.5
	RuntimeApiVersion:  v1
	I1124 09:10:58.629549 1694914 ssh_runner.go:195] Run: containerd --version
	I1124 09:10:58.649939 1694914 ssh_runner.go:195] Run: containerd --version
	I1124 09:10:58.674545 1694914 out.go:179] * Preparing Kubernetes v1.35.0-beta.0 on containerd 2.1.5 ...
	I1124 09:10:58.677457 1694914 cli_runner.go:164] Run: docker network inspect functional-291288 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1124 09:10:58.694197 1694914 ssh_runner.go:195] Run: grep 192.168.49.1	host.minikube.internal$ /etc/hosts
	I1124 09:10:58.698045 1694914 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.168.49.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
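The `/etc/hosts` update above uses a grep-then-append pattern that is idempotent: any existing `host.minikube.internal` entry is filtered out before the fresh one is appended, so repeated runs never accumulate duplicates. A sketch against a scratch file (the file path is illustrative; the real target is `/etc/hosts`):

```shell
# Idempotent hosts-entry update, replayed on a temp file without sudo.
hosts=$(mktemp)
printf '127.0.0.1\tlocalhost\n192.168.49.1\thost.minikube.internal\n' > "$hosts"
# Drop any old entry, then append the current one.
{ grep -v $'\thost.minikube.internal$' "$hosts"; printf '192.168.49.1\thost.minikube.internal\n'; } > "$hosts.new"
mv "$hosts.new" "$hosts"
grep -c 'host.minikube.internal' "$hosts"   # → 1 (no duplicate rows)
```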
	I1124 09:10:58.707752 1694914 kubeadm.go:884] updating cluster {Name:functional-291288 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-291288 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1124 09:10:58.707906 1694914 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:10:58.877273 1694914 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:10:59.030277 1694914 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:10:59.190341 1694914 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
	I1124 09:10:59.190418 1694914 ssh_runner.go:195] Run: sudo crictl images --output json
	I1124 09:10:59.213879 1694914 containerd.go:623] couldn't find preloaded image for "registry.k8s.io/kube-apiserver:v1.35.0-beta.0". assuming images are not preloaded.
	I1124 09:10:59.213893 1694914 cache_images.go:90] LoadCachedImages start: [registry.k8s.io/kube-apiserver:v1.35.0-beta.0 registry.k8s.io/kube-controller-manager:v1.35.0-beta.0 registry.k8s.io/kube-scheduler:v1.35.0-beta.0 registry.k8s.io/kube-proxy:v1.35.0-beta.0 registry.k8s.io/pause:3.10.1 registry.k8s.io/etcd:3.5.24-0 registry.k8s.io/coredns/coredns:v1.13.1 gcr.io/k8s-minikube/storage-provisioner:v5]
	I1124 09:10:59.213951 1694914 image.go:138] retrieving image: gcr.io/k8s-minikube/storage-provisioner:v5
	I1124 09:10:59.213954 1694914 image.go:138] retrieving image: registry.k8s.io/kube-proxy:v1.35.0-beta.0
	I1124 09:10:59.213987 1694914 image.go:138] retrieving image: registry.k8s.io/pause:3.10.1
	I1124 09:10:59.214147 1694914 image.go:138] retrieving image: registry.k8s.io/kube-apiserver:v1.35.0-beta.0
	I1124 09:10:59.214159 1694914 image.go:138] retrieving image: registry.k8s.io/etcd:3.5.24-0
	I1124 09:10:59.214227 1694914 image.go:138] retrieving image: registry.k8s.io/kube-controller-manager:v1.35.0-beta.0
	I1124 09:10:59.214243 1694914 image.go:138] retrieving image: registry.k8s.io/coredns/coredns:v1.13.1
	I1124 09:10:59.214314 1694914 image.go:138] retrieving image: registry.k8s.io/kube-scheduler:v1.35.0-beta.0
	I1124 09:10:59.216217 1694914 image.go:181] daemon lookup for registry.k8s.io/etcd:3.5.24-0: Error response from daemon: No such image: registry.k8s.io/etcd:3.5.24-0
	I1124 09:10:59.216597 1694914 image.go:181] daemon lookup for registry.k8s.io/kube-proxy:v1.35.0-beta.0: Error response from daemon: No such image: registry.k8s.io/kube-proxy:v1.35.0-beta.0
	I1124 09:10:59.216735 1694914 image.go:181] daemon lookup for registry.k8s.io/kube-apiserver:v1.35.0-beta.0: Error response from daemon: No such image: registry.k8s.io/kube-apiserver:v1.35.0-beta.0
	I1124 09:10:59.216845 1694914 image.go:181] daemon lookup for registry.k8s.io/coredns/coredns:v1.13.1: Error response from daemon: No such image: registry.k8s.io/coredns/coredns:v1.13.1
	I1124 09:10:59.216949 1694914 image.go:181] daemon lookup for registry.k8s.io/kube-scheduler:v1.35.0-beta.0: Error response from daemon: No such image: registry.k8s.io/kube-scheduler:v1.35.0-beta.0
	I1124 09:10:59.217046 1694914 image.go:181] daemon lookup for gcr.io/k8s-minikube/storage-provisioner:v5: Error response from daemon: No such image: gcr.io/k8s-minikube/storage-provisioner:v5
	I1124 09:10:59.217249 1694914 image.go:181] daemon lookup for registry.k8s.io/pause:3.10.1: Error response from daemon: No such image: registry.k8s.io/pause:3.10.1
	I1124 09:10:59.217508 1694914 image.go:181] daemon lookup for registry.k8s.io/kube-controller-manager:v1.35.0-beta.0: Error response from daemon: No such image: registry.k8s.io/kube-controller-manager:v1.35.0-beta.0
	I1124 09:10:59.534217 1694914 containerd.go:267] Checking existence of image with name "registry.k8s.io/kube-apiserver:v1.35.0-beta.0" and sha "ccd634d9bcc36ac6235e9c86676cd3a02c06afc3788a25f1bbf39ca7d44585f4"
	I1124 09:10:59.534292 1694914 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images ls name==registry.k8s.io/kube-apiserver:v1.35.0-beta.0
	I1124 09:10:59.542185 1694914 containerd.go:267] Checking existence of image with name "registry.k8s.io/kube-controller-manager:v1.35.0-beta.0" and sha "68b5f775f18769fcb77bd8474c80bda2050163b6c66f4551f352b7381b8ca5be"
	I1124 09:10:59.542858 1694914 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images ls name==registry.k8s.io/kube-controller-manager:v1.35.0-beta.0
	I1124 09:10:59.553849 1694914 containerd.go:267] Checking existence of image with name "registry.k8s.io/pause:3.10.1" and sha "d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd"
	I1124 09:10:59.553918 1694914 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images ls name==registry.k8s.io/pause:3.10.1
	I1124 09:10:59.559402 1694914 containerd.go:267] Checking existence of image with name "registry.k8s.io/etcd:3.5.24-0" and sha "1211402d28f5813ed906916bfcdd0a7404c2f9048ef5bb54387a6745bc410eca"
	I1124 09:10:59.559466 1694914 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images ls name==registry.k8s.io/etcd:3.5.24-0
	I1124 09:10:59.559863 1694914 cache_images.go:118] "registry.k8s.io/kube-apiserver:v1.35.0-beta.0" needs transfer: "registry.k8s.io/kube-apiserver:v1.35.0-beta.0" does not exist at hash "ccd634d9bcc36ac6235e9c86676cd3a02c06afc3788a25f1bbf39ca7d44585f4" in container runtime
	I1124 09:10:59.559902 1694914 cri.go:218] Removing image: registry.k8s.io/kube-apiserver:v1.35.0-beta.0
	I1124 09:10:59.559936 1694914 ssh_runner.go:195] Run: which crictl
	I1124 09:10:59.571930 1694914 containerd.go:267] Checking existence of image with name "registry.k8s.io/kube-proxy:v1.35.0-beta.0" and sha "404c2e12861777b763b8feaa316d36680fc68ad308a8d2f6e55f1bb981cdd904"
	I1124 09:10:59.571999 1694914 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images ls name==registry.k8s.io/kube-proxy:v1.35.0-beta.0
	I1124 09:10:59.586638 1694914 containerd.go:267] Checking existence of image with name "registry.k8s.io/coredns/coredns:v1.13.1" and sha "e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf"
	I1124 09:10:59.586709 1694914 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images ls name==registry.k8s.io/coredns/coredns:v1.13.1
	I1124 09:10:59.597029 1694914 cache_images.go:118] "registry.k8s.io/kube-controller-manager:v1.35.0-beta.0" needs transfer: "registry.k8s.io/kube-controller-manager:v1.35.0-beta.0" does not exist at hash "68b5f775f18769fcb77bd8474c80bda2050163b6c66f4551f352b7381b8ca5be" in container runtime
	I1124 09:10:59.597063 1694914 cri.go:218] Removing image: registry.k8s.io/kube-controller-manager:v1.35.0-beta.0
	I1124 09:10:59.597132 1694914 ssh_runner.go:195] Run: which crictl
	I1124 09:10:59.600407 1694914 cache_images.go:118] "registry.k8s.io/pause:3.10.1" needs transfer: "registry.k8s.io/pause:3.10.1" does not exist at hash "d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd" in container runtime
	I1124 09:10:59.600449 1694914 cri.go:218] Removing image: registry.k8s.io/pause:3.10.1
	I1124 09:10:59.600498 1694914 ssh_runner.go:195] Run: which crictl
	I1124 09:10:59.601563 1694914 containerd.go:267] Checking existence of image with name "registry.k8s.io/kube-scheduler:v1.35.0-beta.0" and sha "16378741539f1be9c6e347d127537d379a6592587b09b4eb47964cb5c43a409b"
	I1124 09:10:59.601618 1694914 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images ls name==registry.k8s.io/kube-scheduler:v1.35.0-beta.0
	I1124 09:10:59.607563 1694914 cache_images.go:118] "registry.k8s.io/etcd:3.5.24-0" needs transfer: "registry.k8s.io/etcd:3.5.24-0" does not exist at hash "1211402d28f5813ed906916bfcdd0a7404c2f9048ef5bb54387a6745bc410eca" in container runtime
	I1124 09:10:59.607591 1694914 cri.go:218] Removing image: registry.k8s.io/etcd:3.5.24-0
	I1124 09:10:59.607592 1694914 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-apiserver:v1.35.0-beta.0
	I1124 09:10:59.607627 1694914 ssh_runner.go:195] Run: which crictl
	I1124 09:10:59.621427 1694914 cache_images.go:118] "registry.k8s.io/kube-proxy:v1.35.0-beta.0" needs transfer: "registry.k8s.io/kube-proxy:v1.35.0-beta.0" does not exist at hash "404c2e12861777b763b8feaa316d36680fc68ad308a8d2f6e55f1bb981cdd904" in container runtime
	I1124 09:10:59.621460 1694914 cri.go:218] Removing image: registry.k8s.io/kube-proxy:v1.35.0-beta.0
	I1124 09:10:59.621507 1694914 ssh_runner.go:195] Run: which crictl
	I1124 09:10:59.646631 1694914 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-controller-manager:v1.35.0-beta.0
	I1124 09:10:59.646697 1694914 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/pause:3.10.1
	I1124 09:10:59.646741 1694914 cache_images.go:118] "registry.k8s.io/kube-scheduler:v1.35.0-beta.0" needs transfer: "registry.k8s.io/kube-scheduler:v1.35.0-beta.0" does not exist at hash "16378741539f1be9c6e347d127537d379a6592587b09b4eb47964cb5c43a409b" in container runtime
	I1124 09:10:59.646768 1694914 cri.go:218] Removing image: registry.k8s.io/kube-scheduler:v1.35.0-beta.0
	I1124 09:10:59.646789 1694914 ssh_runner.go:195] Run: which crictl
	I1124 09:10:59.646845 1694914 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/etcd:3.5.24-0
	I1124 09:10:59.646891 1694914 cache_images.go:118] "registry.k8s.io/coredns/coredns:v1.13.1" needs transfer: "registry.k8s.io/coredns/coredns:v1.13.1" does not exist at hash "e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf" in container runtime
	I1124 09:10:59.646905 1694914 cri.go:218] Removing image: registry.k8s.io/coredns/coredns:v1.13.1
	I1124 09:10:59.646926 1694914 ssh_runner.go:195] Run: which crictl
	I1124 09:10:59.662741 1694914 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-apiserver:v1.35.0-beta.0
	I1124 09:10:59.662805 1694914 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-proxy:v1.35.0-beta.0
	I1124 09:10:59.714166 1694914 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/coredns/coredns:v1.13.1
	I1124 09:10:59.714279 1694914 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-controller-manager:v1.35.0-beta.0
	I1124 09:10:59.714335 1694914 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/pause:3.10.1
	I1124 09:10:59.714375 1694914 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-scheduler:v1.35.0-beta.0
	I1124 09:10:59.714424 1694914 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/etcd:3.5.24-0
	I1124 09:10:59.764913 1694914 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-proxy:v1.35.0-beta.0
	I1124 09:10:59.764933 1694914 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-apiserver:v1.35.0-beta.0
	I1124 09:10:59.829666 1694914 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/etcd:3.5.24-0
	I1124 09:10:59.829748 1694914 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/coredns/coredns:v1.13.1
	I1124 09:10:59.829838 1694914 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-controller-manager:v1.35.0-beta.0
	I1124 09:10:59.829896 1694914 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/pause:3.10.1
	I1124 09:10:59.829952 1694914 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-scheduler:v1.35.0-beta.0
	I1124 09:10:59.851256 1694914 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-proxy:v1.35.0-beta.0
	I1124 09:10:59.856739 1694914 cache_images.go:291] Loading image from: /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0
	I1124 09:10:59.856856 1694914 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/images/kube-apiserver_v1.35.0-beta.0
	I1124 09:10:59.923542 1694914 cache_images.go:291] Loading image from: /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1
	I1124 09:10:59.923630 1694914 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/images/pause_3.10.1
	I1124 09:10:59.923701 1694914 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/coredns/coredns:v1.13.1
	I1124 09:10:59.923751 1694914 cache_images.go:291] Loading image from: /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0
	I1124 09:10:59.923799 1694914 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/images/kube-controller-manager_v1.35.0-beta.0
	I1124 09:10:59.923851 1694914 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-scheduler:v1.35.0-beta.0
	I1124 09:10:59.923894 1694914 cache_images.go:291] Loading image from: /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.5.24-0
	I1124 09:10:59.923937 1694914 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/images/etcd_3.5.24-0
	I1124 09:10:59.940600 1694914 cache_images.go:291] Loading image from: /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0
	I1124 09:10:59.940707 1694914 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/images/kube-proxy_v1.35.0-beta.0
	I1124 09:10:59.940788 1694914 ssh_runner.go:352] existence check for /var/lib/minikube/images/kube-apiserver_v1.35.0-beta.0: stat -c "%s %y" /var/lib/minikube/images/kube-apiserver_v1.35.0-beta.0: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/images/kube-apiserver_v1.35.0-beta.0': No such file or directory
	I1124 09:10:59.940813 1694914 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 --> /var/lib/minikube/images/kube-apiserver_v1.35.0-beta.0 (24689152 bytes)
	I1124 09:11:00.004558 1694914 ssh_runner.go:352] existence check for /var/lib/minikube/images/etcd_3.5.24-0: stat -c "%s %y" /var/lib/minikube/images/etcd_3.5.24-0: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/images/etcd_3.5.24-0': No such file or directory
	I1124 09:11:00.004603 1694914 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.5.24-0 --> /var/lib/minikube/images/etcd_3.5.24-0 (21895168 bytes)
	I1124 09:11:00.004695 1694914 ssh_runner.go:352] existence check for /var/lib/minikube/images/pause_3.10.1: stat -c "%s %y" /var/lib/minikube/images/pause_3.10.1: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/images/pause_3.10.1': No such file or directory
	I1124 09:11:00.004709 1694914 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 --> /var/lib/minikube/images/pause_3.10.1 (268288 bytes)
	I1124 09:11:00.004773 1694914 cache_images.go:291] Loading image from: /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1
	I1124 09:11:00.004872 1694914 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/images/coredns_v1.13.1
	I1124 09:11:00.004939 1694914 ssh_runner.go:352] existence check for /var/lib/minikube/images/kube-controller-manager_v1.35.0-beta.0: stat -c "%s %y" /var/lib/minikube/images/kube-controller-manager_v1.35.0-beta.0: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/images/kube-controller-manager_v1.35.0-beta.0': No such file or directory
	I1124 09:11:00.004950 1694914 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 --> /var/lib/minikube/images/kube-controller-manager_v1.35.0-beta.0 (20672000 bytes)
	I1124 09:11:00.005006 1694914 cache_images.go:291] Loading image from: /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0
	I1124 09:11:00.005051 1694914 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/images/kube-scheduler_v1.35.0-beta.0
	I1124 09:11:00.005114 1694914 ssh_runner.go:352] existence check for /var/lib/minikube/images/kube-proxy_v1.35.0-beta.0: stat -c "%s %y" /var/lib/minikube/images/kube-proxy_v1.35.0-beta.0: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/images/kube-proxy_v1.35.0-beta.0': No such file or directory
	I1124 09:11:00.005124 1694914 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 --> /var/lib/minikube/images/kube-proxy_v1.35.0-beta.0 (22432256 bytes)
	I1124 09:11:00.099103 1694914 ssh_runner.go:352] existence check for /var/lib/minikube/images/kube-scheduler_v1.35.0-beta.0: stat -c "%s %y" /var/lib/minikube/images/kube-scheduler_v1.35.0-beta.0: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/images/kube-scheduler_v1.35.0-beta.0': No such file or directory
	I1124 09:11:00.099140 1694914 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 --> /var/lib/minikube/images/kube-scheduler_v1.35.0-beta.0 (15401984 bytes)
	I1124 09:11:00.099208 1694914 ssh_runner.go:352] existence check for /var/lib/minikube/images/coredns_v1.13.1: stat -c "%s %y" /var/lib/minikube/images/coredns_v1.13.1: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/images/coredns_v1.13.1': No such file or directory
	I1124 09:11:00.099235 1694914 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 --> /var/lib/minikube/images/coredns_v1.13.1 (21178368 bytes)
	I1124 09:11:00.202973 1694914 containerd.go:285] Loading image: /var/lib/minikube/images/pause_3.10.1
	I1124 09:11:00.203047 1694914 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images import /var/lib/minikube/images/pause_3.10.1
	W1124 09:11:00.500199 1694914 image.go:286] image gcr.io/k8s-minikube/storage-provisioner:v5 arch mismatch: want arm64 got amd64. fixing
	I1124 09:11:00.500324 1694914 containerd.go:267] Checking existence of image with name "gcr.io/k8s-minikube/storage-provisioner:v5" and sha "66749159455b3f08c8318fe0233122f54d0f5889f9c5fdfb73c3fd9d99895b51"
	I1124 09:11:00.500414 1694914 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images ls name==gcr.io/k8s-minikube/storage-provisioner:v5
	I1124 09:11:00.612077 1694914 cache_images.go:323] Transferred and loaded /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 from cache
	I1124 09:11:00.637577 1694914 cache_images.go:118] "gcr.io/k8s-minikube/storage-provisioner:v5" needs transfer: "gcr.io/k8s-minikube/storage-provisioner:v5" does not exist at hash "66749159455b3f08c8318fe0233122f54d0f5889f9c5fdfb73c3fd9d99895b51" in container runtime
	I1124 09:11:00.637612 1694914 cri.go:218] Removing image: gcr.io/k8s-minikube/storage-provisioner:v5
	I1124 09:11:00.637661 1694914 ssh_runner.go:195] Run: which crictl
	I1124 09:11:00.728105 1694914 containerd.go:285] Loading image: /var/lib/minikube/images/kube-scheduler_v1.35.0-beta.0
	I1124 09:11:00.728178 1694914 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images import /var/lib/minikube/images/kube-scheduler_v1.35.0-beta.0
	I1124 09:11:00.736575 1694914 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi gcr.io/k8s-minikube/storage-provisioner:v5
	I1124 09:11:01.996087 1694914 ssh_runner.go:235] Completed: sudo ctr -n=k8s.io images import /var/lib/minikube/images/kube-scheduler_v1.35.0-beta.0: (1.26785493s)
	I1124 09:11:01.996105 1694914 cache_images.go:323] Transferred and loaded /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 from cache
	I1124 09:11:01.996124 1694914 containerd.go:285] Loading image: /var/lib/minikube/images/kube-controller-manager_v1.35.0-beta.0
	I1124 09:11:01.996140 1694914 ssh_runner.go:235] Completed: sudo /usr/local/bin/crictl rmi gcr.io/k8s-minikube/storage-provisioner:v5: (1.25954481s)
	I1124 09:11:01.996171 1694914 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images import /var/lib/minikube/images/kube-controller-manager_v1.35.0-beta.0
	I1124 09:11:01.996211 1694914 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi gcr.io/k8s-minikube/storage-provisioner:v5
	I1124 09:11:02.973093 1694914 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi gcr.io/k8s-minikube/storage-provisioner:v5
	I1124 09:11:02.973200 1694914 cache_images.go:323] Transferred and loaded /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 from cache
	I1124 09:11:02.973226 1694914 containerd.go:285] Loading image: /var/lib/minikube/images/etcd_3.5.24-0
	I1124 09:11:02.973252 1694914 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images import /var/lib/minikube/images/etcd_3.5.24-0
	I1124 09:11:03.007895 1694914 cache_images.go:291] Loading image from: /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5
	I1124 09:11:03.008042 1694914 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/images/storage-provisioner_v5
	I1124 09:11:04.504690 1694914 ssh_runner.go:235] Completed: sudo ctr -n=k8s.io images import /var/lib/minikube/images/etcd_3.5.24-0: (1.531416636s)
	I1124 09:11:04.504706 1694914 cache_images.go:323] Transferred and loaded /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.5.24-0 from cache
	I1124 09:11:04.504725 1694914 containerd.go:285] Loading image: /var/lib/minikube/images/coredns_v1.13.1
	I1124 09:11:04.504782 1694914 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images import /var/lib/minikube/images/coredns_v1.13.1
	I1124 09:11:04.504840 1694914 ssh_runner.go:235] Completed: stat -c "%s %y" /var/lib/minikube/images/storage-provisioner_v5: (1.496789046s)
	I1124 09:11:04.504854 1694914 ssh_runner.go:352] existence check for /var/lib/minikube/images/storage-provisioner_v5: stat -c "%s %y" /var/lib/minikube/images/storage-provisioner_v5: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/images/storage-provisioner_v5': No such file or directory
	I1124 09:11:04.504867 1694914 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 --> /var/lib/minikube/images/storage-provisioner_v5 (8035840 bytes)
	I1124 09:11:05.498781 1694914 cache_images.go:323] Transferred and loaded /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 from cache
	I1124 09:11:05.498817 1694914 containerd.go:285] Loading image: /var/lib/minikube/images/kube-apiserver_v1.35.0-beta.0
	I1124 09:11:05.498890 1694914 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images import /var/lib/minikube/images/kube-apiserver_v1.35.0-beta.0
	I1124 09:11:06.569269 1694914 ssh_runner.go:235] Completed: sudo ctr -n=k8s.io images import /var/lib/minikube/images/kube-apiserver_v1.35.0-beta.0: (1.070354692s)
	I1124 09:11:06.569286 1694914 cache_images.go:323] Transferred and loaded /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 from cache
	I1124 09:11:06.569304 1694914 containerd.go:285] Loading image: /var/lib/minikube/images/kube-proxy_v1.35.0-beta.0
	I1124 09:11:06.569370 1694914 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images import /var/lib/minikube/images/kube-proxy_v1.35.0-beta.0
	I1124 09:11:07.534017 1694914 cache_images.go:323] Transferred and loaded /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 from cache
	I1124 09:11:07.534052 1694914 containerd.go:285] Loading image: /var/lib/minikube/images/storage-provisioner_v5
	I1124 09:11:07.534104 1694914 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images import /var/lib/minikube/images/storage-provisioner_v5
	I1124 09:11:07.884303 1694914 cache_images.go:323] Transferred and loaded /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 from cache
	I1124 09:11:07.884328 1694914 cache_images.go:125] Successfully loaded all cached images
	I1124 09:11:07.884332 1694914 cache_images.go:94] duration metric: took 8.670426913s to LoadCachedImages
	I1124 09:11:07.884344 1694914 kubeadm.go:935] updating node { 192.168.49.2 8441 v1.35.0-beta.0 containerd true true} ...
	I1124 09:11:07.884435 1694914 kubeadm.go:947] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.35.0-beta.0/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=functional-291288 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.49.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-291288 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I1124 09:11:07.884513 1694914 ssh_runner.go:195] Run: sudo crictl info
	I1124 09:11:07.913455 1694914 cni.go:84] Creating CNI manager for ""
	I1124 09:11:07.913466 1694914 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1124 09:11:07.913487 1694914 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1124 09:11:07.913511 1694914 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.49.2 APIServerPort:8441 KubernetesVersion:v1.35.0-beta.0 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:functional-291288 NodeName:functional-291288 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.49.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.49.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt Sta
ticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1124 09:11:07.913621 1694914 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.49.2
	  bindPort: 8441
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "functional-291288"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.49.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8441
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.35.0-beta.0
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I1124 09:11:07.913691 1694914 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.35.0-beta.0
	I1124 09:11:07.921585 1694914 binaries.go:54] Didn't find k8s binaries: sudo ls /var/lib/minikube/binaries/v1.35.0-beta.0: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/var/lib/minikube/binaries/v1.35.0-beta.0': No such file or directory
	
	Initiating transfer...
	I1124 09:11:07.921637 1694914 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/binaries/v1.35.0-beta.0
	I1124 09:11:07.929430 1694914 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubectl?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubectl.sha256
	I1124 09:11:07.929515 1694914 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl
	I1124 09:11:07.929598 1694914 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubelet?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubelet.sha256
	I1124 09:11:07.929635 1694914 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1124 09:11:07.929728 1694914 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:11:07.929780 1694914 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.35.0-beta.0/kubeadm
	I1124 09:11:07.934549 1694914 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl: stat -c "%s %y" /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.35.0-beta.0/kubectl': No such file or directory
	I1124 09:11:07.934607 1694914 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/linux/arm64/v1.35.0-beta.0/kubectl --> /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl (55181496 bytes)
	I1124 09:11:07.951473 1694914 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.35.0-beta.0/kubeadm: stat -c "%s %y" /var/lib/minikube/binaries/v1.35.0-beta.0/kubeadm: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.35.0-beta.0/kubeadm': No such file or directory
	I1124 09:11:07.951499 1694914 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/linux/arm64/v1.35.0-beta.0/kubeadm --> /var/lib/minikube/binaries/v1.35.0-beta.0/kubeadm (68354232 bytes)
	I1124 09:11:07.951562 1694914 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.35.0-beta.0/kubelet
	I1124 09:11:07.972473 1694914 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.35.0-beta.0/kubelet: stat -c "%s %y" /var/lib/minikube/binaries/v1.35.0-beta.0/kubelet: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.35.0-beta.0/kubelet': No such file or directory
	I1124 09:11:07.972499 1694914 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/linux/arm64/v1.35.0-beta.0/kubelet --> /var/lib/minikube/binaries/v1.35.0-beta.0/kubelet (54329636 bytes)
	I1124 09:11:08.780476 1694914 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1124 09:11:08.788489 1694914 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (328 bytes)
	I1124 09:11:08.802555 1694914 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (359 bytes)
	I1124 09:11:08.816871 1694914 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2237 bytes)
	I1124 09:11:08.830720 1694914 ssh_runner.go:195] Run: grep 192.168.49.2	control-plane.minikube.internal$ /etc/hosts
	I1124 09:11:08.834271 1694914 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.168.49.2	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I1124 09:11:08.843967 1694914 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1124 09:11:08.959074 1694914 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1124 09:11:08.975238 1694914 certs.go:69] Setting up /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288 for IP: 192.168.49.2
	I1124 09:11:08.975248 1694914 certs.go:195] generating shared ca certs ...
	I1124 09:11:08.975263 1694914 certs.go:227] acquiring lock for ca certs: {Name:mkbe540a30c4376a351176f7fe6fec044d058b09 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1124 09:11:08.975396 1694914 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.key
	I1124 09:11:08.975443 1694914 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/proxy-client-ca.key
	I1124 09:11:08.975449 1694914 certs.go:257] generating profile certs ...
	I1124 09:11:08.975504 1694914 certs.go:364] generating signed profile cert for "minikube-user": /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/client.key
	I1124 09:11:08.975514 1694914 crypto.go:68] Generating cert /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/client.crt with IP's: []
	I1124 09:11:09.228058 1694914 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/client.crt ...
	I1124 09:11:09.228074 1694914 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/client.crt: {Name:mk753e7f8e5e109ed09fd18edb17e6d6c7914d3c Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1124 09:11:09.228290 1694914 crypto.go:164] Writing key to /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/client.key ...
	I1124 09:11:09.228297 1694914 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/client.key: {Name:mk97c6dc3b44e3dc3628ca31efc528eb37fc8bf2 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1124 09:11:09.228389 1694914 certs.go:364] generating signed profile cert for "minikube": /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/apiserver.key.5acb2515
	I1124 09:11:09.228400 1694914 crypto.go:68] Generating cert /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/apiserver.crt.5acb2515 with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.168.49.2]
	I1124 09:11:09.392562 1694914 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/apiserver.crt.5acb2515 ...
	I1124 09:11:09.392584 1694914 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/apiserver.crt.5acb2515: {Name:mk83c7376d3c7bb0daa94d4b98453d9516c00a62 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1124 09:11:09.392797 1694914 crypto.go:164] Writing key to /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/apiserver.key.5acb2515 ...
	I1124 09:11:09.392806 1694914 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/apiserver.key.5acb2515: {Name:mk38534fb5da78d9cc7e8475b306c7ed19fa1d53 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1124 09:11:09.392896 1694914 certs.go:382] copying /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/apiserver.crt.5acb2515 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/apiserver.crt
	I1124 09:11:09.392975 1694914 certs.go:386] copying /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/apiserver.key.5acb2515 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/apiserver.key
	I1124 09:11:09.393029 1694914 certs.go:364] generating signed profile cert for "aggregator": /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/proxy-client.key
	I1124 09:11:09.393042 1694914 crypto.go:68] Generating cert /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/proxy-client.crt with IP's: []
	I1124 09:11:09.671985 1694914 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/proxy-client.crt ...
	I1124 09:11:09.672000 1694914 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/proxy-client.crt: {Name:mk051f70cf66c671949045e5753e528a00631565 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1124 09:11:09.672198 1694914 crypto.go:164] Writing key to /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/proxy-client.key ...
	I1124 09:11:09.672206 1694914 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/proxy-client.key: {Name:mkd583cac14bdd01514636cc154ec3a8aa8f1094 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1124 09:11:09.672405 1694914 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/1654467.pem (1338 bytes)
	W1124 09:11:09.672447 1694914 certs.go:480] ignoring /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/1654467_empty.pem, impossibly tiny 0 bytes
	I1124 09:11:09.672455 1694914 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca-key.pem (1671 bytes)
	I1124 09:11:09.672481 1694914 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem (1078 bytes)
	I1124 09:11:09.672506 1694914 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/cert.pem (1123 bytes)
	I1124 09:11:09.672529 1694914 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/key.pem (1679 bytes)
	I1124 09:11:09.672574 1694914 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/ssl/certs/16544672.pem (1708 bytes)
	I1124 09:11:09.673137 1694914 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1124 09:11:09.692563 1694914 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I1124 09:11:09.711428 1694914 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1124 09:11:09.730117 1694914 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I1124 09:11:09.747862 1694914 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I1124 09:11:09.765408 1694914 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I1124 09:11:09.783215 1694914 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1124 09:11:09.800817 1694914 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I1124 09:11:09.819097 1694914 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1124 09:11:09.836990 1694914 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/1654467.pem --> /usr/share/ca-certificates/1654467.pem (1338 bytes)
	I1124 09:11:09.854588 1694914 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/ssl/certs/16544672.pem --> /usr/share/ca-certificates/16544672.pem (1708 bytes)
	I1124 09:11:09.871886 1694914 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I1124 09:11:09.893213 1694914 ssh_runner.go:195] Run: openssl version
	I1124 09:11:09.899665 1694914 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/16544672.pem && ln -fs /usr/share/ca-certificates/16544672.pem /etc/ssl/certs/16544672.pem"
	I1124 09:11:09.908243 1694914 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/16544672.pem
	I1124 09:11:09.913038 1694914 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Nov 24 09:10 /usr/share/ca-certificates/16544672.pem
	I1124 09:11:09.913109 1694914 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/16544672.pem
	I1124 09:11:09.957742 1694914 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/16544672.pem /etc/ssl/certs/3ec20f2e.0"
	I1124 09:11:09.966292 1694914 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I1124 09:11:09.974661 1694914 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1124 09:11:09.978426 1694914 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Nov 24 08:44 /usr/share/ca-certificates/minikubeCA.pem
	I1124 09:11:09.978516 1694914 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1124 09:11:10.020402 1694914 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I1124 09:11:10.031283 1694914 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1654467.pem && ln -fs /usr/share/ca-certificates/1654467.pem /etc/ssl/certs/1654467.pem"
	I1124 09:11:10.041127 1694914 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1654467.pem
	I1124 09:11:10.045678 1694914 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Nov 24 09:10 /usr/share/ca-certificates/1654467.pem
	I1124 09:11:10.045739 1694914 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1654467.pem
	I1124 09:11:10.087792 1694914 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1654467.pem /etc/ssl/certs/51391683.0"
	I1124 09:11:10.096786 1694914 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1124 09:11:10.100579 1694914 certs.go:400] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I1124 09:11:10.100624 1694914 kubeadm.go:401] StartCluster: {Name:functional-291288 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-291288 Namespace:default APIServerHAVIP: APIServerName:minikubeCA
APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false
CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1124 09:11:10.100700 1694914 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I1124 09:11:10.100765 1694914 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1124 09:11:10.130988 1694914 cri.go:89] found id: ""
	I1124 09:11:10.131053 1694914 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1124 09:11:10.139020 1694914 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1124 09:11:10.147207 1694914 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1124 09:11:10.147273 1694914 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1124 09:11:10.156605 1694914 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1124 09:11:10.156614 1694914 kubeadm.go:158] found existing configuration files:
	
	I1124 09:11:10.156668 1694914 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1124 09:11:10.164610 1694914 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1124 09:11:10.164680 1694914 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1124 09:11:10.172153 1694914 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1124 09:11:10.180143 1694914 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1124 09:11:10.180200 1694914 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1124 09:11:10.187889 1694914 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1124 09:11:10.195659 1694914 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1124 09:11:10.195715 1694914 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1124 09:11:10.203100 1694914 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1124 09:11:10.210868 1694914 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1124 09:11:10.210932 1694914 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1124 09:11:10.218472 1694914 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1124 09:11:10.255605 1694914 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-beta.0
	I1124 09:11:10.255857 1694914 kubeadm.go:319] [preflight] Running pre-flight checks
	I1124 09:11:10.323327 1694914 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1124 09:11:10.323409 1694914 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1124 09:11:10.323450 1694914 kubeadm.go:319] OS: Linux
	I1124 09:11:10.323495 1694914 kubeadm.go:319] CGROUPS_CPU: enabled
	I1124 09:11:10.323549 1694914 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1124 09:11:10.323603 1694914 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1124 09:11:10.323657 1694914 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1124 09:11:10.323705 1694914 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1124 09:11:10.323758 1694914 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1124 09:11:10.323822 1694914 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1124 09:11:10.323884 1694914 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1124 09:11:10.323929 1694914 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1124 09:11:10.393993 1694914 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1124 09:11:10.394149 1694914 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1124 09:11:10.394301 1694914 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1124 09:11:12.251125 1694914 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1124 09:11:12.255462 1694914 out.go:252]   - Generating certificates and keys ...
	I1124 09:11:12.255545 1694914 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1124 09:11:12.255609 1694914 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1124 09:11:12.726820 1694914 kubeadm.go:319] [certs] Generating "apiserver-kubelet-client" certificate and key
	I1124 09:11:12.805310 1694914 kubeadm.go:319] [certs] Generating "front-proxy-ca" certificate and key
	I1124 09:11:13.299639 1694914 kubeadm.go:319] [certs] Generating "front-proxy-client" certificate and key
	I1124 09:11:13.636577 1694914 kubeadm.go:319] [certs] Generating "etcd/ca" certificate and key
	I1124 09:11:14.097053 1694914 kubeadm.go:319] [certs] Generating "etcd/server" certificate and key
	I1124 09:11:14.097371 1694914 kubeadm.go:319] [certs] etcd/server serving cert is signed for DNS names [functional-291288 localhost] and IPs [192.168.49.2 127.0.0.1 ::1]
	I1124 09:11:14.169000 1694914 kubeadm.go:319] [certs] Generating "etcd/peer" certificate and key
	I1124 09:11:14.169303 1694914 kubeadm.go:319] [certs] etcd/peer serving cert is signed for DNS names [functional-291288 localhost] and IPs [192.168.49.2 127.0.0.1 ::1]
	I1124 09:11:14.450969 1694914 kubeadm.go:319] [certs] Generating "etcd/healthcheck-client" certificate and key
	I1124 09:11:14.822745 1694914 kubeadm.go:319] [certs] Generating "apiserver-etcd-client" certificate and key
	I1124 09:11:15.540994 1694914 kubeadm.go:319] [certs] Generating "sa" key and public key
	I1124 09:11:15.541266 1694914 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1124 09:11:15.664619 1694914 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1124 09:11:15.957602 1694914 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1124 09:11:16.034135 1694914 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1124 09:11:16.282142 1694914 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1124 09:11:16.554937 1694914 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1124 09:11:16.555535 1694914 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1124 09:11:16.558368 1694914 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1124 09:11:16.565915 1694914 out.go:252]   - Booting up control plane ...
	I1124 09:11:16.566029 1694914 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1124 09:11:16.566113 1694914 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1124 09:11:16.566182 1694914 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1124 09:11:16.595167 1694914 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1124 09:11:16.595268 1694914 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1124 09:11:16.602334 1694914 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1124 09:11:16.602803 1694914 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1124 09:11:16.602846 1694914 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1124 09:11:16.738886 1694914 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1124 09:11:16.739011 1694914 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1124 09:15:16.735473 1694914 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.001254358s
	I1124 09:15:16.735508 1694914 kubeadm.go:319] 
	I1124 09:15:16.735575 1694914 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1124 09:15:16.735610 1694914 kubeadm.go:319] 	- The kubelet is not running
	I1124 09:15:16.735713 1694914 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1124 09:15:16.735718 1694914 kubeadm.go:319] 
	I1124 09:15:16.735832 1694914 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1124 09:15:16.735862 1694914 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1124 09:15:16.735900 1694914 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1124 09:15:16.735903 1694914 kubeadm.go:319] 
	I1124 09:15:16.739499 1694914 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1124 09:15:16.739991 1694914 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1124 09:15:16.740100 1694914 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1124 09:15:16.740345 1694914 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	I1124 09:15:16.740350 1694914 kubeadm.go:319] 
	I1124 09:15:16.740414 1694914 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	W1124 09:15:16.740525 1694914 out.go:285] ! initialization failed, will try again: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Generating "apiserver-kubelet-client" certificate and key
	[certs] Generating "front-proxy-ca" certificate and key
	[certs] Generating "front-proxy-client" certificate and key
	[certs] Generating "etcd/ca" certificate and key
	[certs] Generating "etcd/server" certificate and key
	[certs] etcd/server serving cert is signed for DNS names [functional-291288 localhost] and IPs [192.168.49.2 127.0.0.1 ::1]
	[certs] Generating "etcd/peer" certificate and key
	[certs] etcd/peer serving cert is signed for DNS names [functional-291288 localhost] and IPs [192.168.49.2 127.0.0.1 ::1]
	[certs] Generating "etcd/healthcheck-client" certificate and key
	[certs] Generating "apiserver-etcd-client" certificate and key
	[certs] Generating "sa" key and public key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001254358s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	I1124 09:15:16.740922 1694914 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm reset --cri-socket /run/containerd/containerd.sock --force"
	I1124 09:15:17.147706 1694914 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1124 09:15:17.161333 1694914 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1124 09:15:17.161388 1694914 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1124 09:15:17.169624 1694914 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1124 09:15:17.169633 1694914 kubeadm.go:158] found existing configuration files:
	
	I1124 09:15:17.169687 1694914 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1124 09:15:17.177661 1694914 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1124 09:15:17.177716 1694914 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1124 09:15:17.185118 1694914 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1124 09:15:17.193094 1694914 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1124 09:15:17.193148 1694914 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1124 09:15:17.200550 1694914 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1124 09:15:17.208097 1694914 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1124 09:15:17.208152 1694914 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1124 09:15:17.215654 1694914 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1124 09:15:17.223313 1694914 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1124 09:15:17.223367 1694914 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1124 09:15:17.230595 1694914 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1124 09:15:17.267982 1694914 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-beta.0
	I1124 09:15:17.268068 1694914 kubeadm.go:319] [preflight] Running pre-flight checks
	I1124 09:15:17.337504 1694914 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1124 09:15:17.337569 1694914 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1124 09:15:17.337604 1694914 kubeadm.go:319] OS: Linux
	I1124 09:15:17.337648 1694914 kubeadm.go:319] CGROUPS_CPU: enabled
	I1124 09:15:17.337696 1694914 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1124 09:15:17.337742 1694914 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1124 09:15:17.337789 1694914 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1124 09:15:17.337837 1694914 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1124 09:15:17.337885 1694914 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1124 09:15:17.337929 1694914 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1124 09:15:17.337977 1694914 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1124 09:15:17.338022 1694914 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1124 09:15:17.407308 1694914 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1124 09:15:17.407413 1694914 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1124 09:15:17.407523 1694914 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1124 09:15:17.412511 1694914 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1124 09:15:17.417710 1694914 out.go:252]   - Generating certificates and keys ...
	I1124 09:15:17.417801 1694914 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1124 09:15:17.417866 1694914 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1124 09:15:17.417953 1694914 kubeadm.go:319] [certs] Using existing apiserver-kubelet-client certificate and key on disk
	I1124 09:15:17.418027 1694914 kubeadm.go:319] [certs] Using existing front-proxy-ca certificate authority
	I1124 09:15:17.418135 1694914 kubeadm.go:319] [certs] Using existing front-proxy-client certificate and key on disk
	I1124 09:15:17.418201 1694914 kubeadm.go:319] [certs] Using existing etcd/ca certificate authority
	I1124 09:15:17.418275 1694914 kubeadm.go:319] [certs] Using existing etcd/server certificate and key on disk
	I1124 09:15:17.418342 1694914 kubeadm.go:319] [certs] Using existing etcd/peer certificate and key on disk
	I1124 09:15:17.418435 1694914 kubeadm.go:319] [certs] Using existing etcd/healthcheck-client certificate and key on disk
	I1124 09:15:17.418578 1694914 kubeadm.go:319] [certs] Using existing apiserver-etcd-client certificate and key on disk
	I1124 09:15:17.418616 1694914 kubeadm.go:319] [certs] Using the existing "sa" key
	I1124 09:15:17.418920 1694914 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1124 09:15:17.819615 1694914 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1124 09:15:17.868257 1694914 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1124 09:15:18.451067 1694914 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1124 09:15:18.740921 1694914 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1124 09:15:18.788771 1694914 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1124 09:15:18.789263 1694914 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1124 09:15:18.791887 1694914 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1124 09:15:18.797176 1694914 out.go:252]   - Booting up control plane ...
	I1124 09:15:18.797275 1694914 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1124 09:15:18.797352 1694914 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1124 09:15:18.797420 1694914 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1124 09:15:18.816698 1694914 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1124 09:15:18.816943 1694914 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1124 09:15:18.825721 1694914 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1124 09:15:18.826046 1694914 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1124 09:15:18.826252 1694914 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1124 09:15:18.962915 1694914 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1124 09:15:18.963029 1694914 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1124 09:19:18.960816 1694914 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.000282192s
	I1124 09:19:18.960854 1694914 kubeadm.go:319] 
	I1124 09:19:18.960918 1694914 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1124 09:19:18.960950 1694914 kubeadm.go:319] 	- The kubelet is not running
	I1124 09:19:18.961060 1694914 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1124 09:19:18.961063 1694914 kubeadm.go:319] 
	I1124 09:19:18.961166 1694914 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1124 09:19:18.961197 1694914 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1124 09:19:18.961227 1694914 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1124 09:19:18.961230 1694914 kubeadm.go:319] 
	I1124 09:19:18.965938 1694914 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1124 09:19:18.966381 1694914 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1124 09:19:18.966506 1694914 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1124 09:19:18.966743 1694914 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	I1124 09:19:18.966748 1694914 kubeadm.go:319] 
	I1124 09:19:18.966816 1694914 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	I1124 09:19:18.966866 1694914 kubeadm.go:403] duration metric: took 8m8.866247724s to StartCluster
	I1124 09:19:18.966902 1694914 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:19:18.966968 1694914 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:19:18.993580 1694914 cri.go:89] found id: ""
	I1124 09:19:18.993608 1694914 logs.go:282] 0 containers: []
	W1124 09:19:18.993616 1694914 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:19:18.993621 1694914 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:19:18.993694 1694914 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:19:19.025741 1694914 cri.go:89] found id: ""
	I1124 09:19:19.025754 1694914 logs.go:282] 0 containers: []
	W1124 09:19:19.025761 1694914 logs.go:284] No container was found matching "etcd"
	I1124 09:19:19.025766 1694914 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:19:19.025826 1694914 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:19:19.051208 1694914 cri.go:89] found id: ""
	I1124 09:19:19.051221 1694914 logs.go:282] 0 containers: []
	W1124 09:19:19.051228 1694914 logs.go:284] No container was found matching "coredns"
	I1124 09:19:19.051234 1694914 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:19:19.051292 1694914 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:19:19.080124 1694914 cri.go:89] found id: ""
	I1124 09:19:19.080139 1694914 logs.go:282] 0 containers: []
	W1124 09:19:19.080146 1694914 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:19:19.080152 1694914 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:19:19.080215 1694914 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:19:19.104890 1694914 cri.go:89] found id: ""
	I1124 09:19:19.104905 1694914 logs.go:282] 0 containers: []
	W1124 09:19:19.104911 1694914 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:19:19.104917 1694914 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:19:19.104973 1694914 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:19:19.128531 1694914 cri.go:89] found id: ""
	I1124 09:19:19.128544 1694914 logs.go:282] 0 containers: []
	W1124 09:19:19.128552 1694914 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:19:19.128557 1694914 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:19:19.128616 1694914 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:19:19.152553 1694914 cri.go:89] found id: ""
	I1124 09:19:19.152566 1694914 logs.go:282] 0 containers: []
	W1124 09:19:19.152573 1694914 logs.go:284] No container was found matching "kindnet"
	I1124 09:19:19.152592 1694914 logs.go:123] Gathering logs for container status ...
	I1124 09:19:19.152603 1694914 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:19:19.184209 1694914 logs.go:123] Gathering logs for kubelet ...
	I1124 09:19:19.184226 1694914 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:19:19.238971 1694914 logs.go:123] Gathering logs for dmesg ...
	I1124 09:19:19.238993 1694914 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:19:19.256182 1694914 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:19:19.256198 1694914 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:19:19.322222 1694914 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:19:19.314221    5436 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:19:19.314962    5436 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:19:19.316606    5436 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:19:19.316916    5436 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:19:19.318426    5436 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1124 09:19:19.322242 1694914 logs.go:123] Gathering logs for containerd ...
	I1124 09:19:19.322253 1694914 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	W1124 09:19:19.364700 1694914 out.go:434] Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000282192s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	W1124 09:19:19.364759 1694914 out.go:285] * 
	W1124 09:19:19.364884 1694914 out.go:285] X Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	W1124 09:19:19.364943 1694914 out.go:285] * 
	W1124 09:19:19.367097 1694914 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1124 09:19:19.372870 1694914 out.go:203] 
	W1124 09:19:19.376475 1694914 out.go:285] X Exiting due to K8S_KUBELET_NOT_RUNNING: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	W1124 09:19:19.376577 1694914 out.go:285] * Suggestion: Check output of 'journalctl -xeu kubelet', try passing --extra-config=kubelet.cgroup-driver=systemd to minikube start
	W1124 09:19:19.376609 1694914 out.go:285] * Related issue: https://github.com/kubernetes/minikube/issues/4172
	I1124 09:19:19.380680 1694914 out.go:203] 
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> containerd <==
	Nov 24 09:11:04 functional-291288 containerd[765]: time="2025-11-24T09:11:04.504620089Z" level=info msg="ImageCreate event name:\"sha256:1211402d28f5813ed906916bfcdd0a7404c2f9048ef5bb54387a6745bc410eca\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Nov 24 09:11:04 functional-291288 containerd[765]: time="2025-11-24T09:11:04.505456685Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/etcd:3.5.24-0\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Nov 24 09:11:05 functional-291288 containerd[765]: time="2025-11-24T09:11:05.490391347Z" level=info msg="No images store for sha256:5e4a4fe83792bf529a4e283e09069cf50cc9882d04168a33903ed6809a492e61"
	Nov 24 09:11:05 functional-291288 containerd[765]: time="2025-11-24T09:11:05.492640241Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.13.1\""
	Nov 24 09:11:05 functional-291288 containerd[765]: time="2025-11-24T09:11:05.506238328Z" level=info msg="ImageCreate event name:\"sha256:e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Nov 24 09:11:05 functional-291288 containerd[765]: time="2025-11-24T09:11:05.506997335Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/coredns/coredns:v1.13.1\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Nov 24 09:11:06 functional-291288 containerd[765]: time="2025-11-24T09:11:06.558357270Z" level=info msg="No images store for sha256:64f3fb0a3392f487dbd4300c920f76dc3de2961e11fd6bfbedc75c0d25b1954c"
	Nov 24 09:11:06 functional-291288 containerd[765]: time="2025-11-24T09:11:06.560515324Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.35.0-beta.0\""
	Nov 24 09:11:06 functional-291288 containerd[765]: time="2025-11-24T09:11:06.568659724Z" level=info msg="ImageCreate event name:\"sha256:ccd634d9bcc36ac6235e9c86676cd3a02c06afc3788a25f1bbf39ca7d44585f4\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Nov 24 09:11:06 functional-291288 containerd[765]: time="2025-11-24T09:11:06.570120589Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/kube-apiserver:v1.35.0-beta.0\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Nov 24 09:11:07 functional-291288 containerd[765]: time="2025-11-24T09:11:07.521945979Z" level=info msg="No images store for sha256:eb9020767c0d3bbd754f3f52cbe4c8bdd935dd5862604d6dc0b1f10422189544"
	Nov 24 09:11:07 functional-291288 containerd[765]: time="2025-11-24T09:11:07.524671177Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.35.0-beta.0\""
	Nov 24 09:11:07 functional-291288 containerd[765]: time="2025-11-24T09:11:07.536458865Z" level=info msg="ImageCreate event name:\"sha256:404c2e12861777b763b8feaa316d36680fc68ad308a8d2f6e55f1bb981cdd904\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Nov 24 09:11:07 functional-291288 containerd[765]: time="2025-11-24T09:11:07.536909085Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/kube-proxy:v1.35.0-beta.0\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Nov 24 09:11:07 functional-291288 containerd[765]: time="2025-11-24T09:11:07.875397104Z" level=info msg="No images store for sha256:7475c7d18769df89a804d5bebf679dbf94886f3626f07a2be923beaa0cc7e5b0"
	Nov 24 09:11:07 functional-291288 containerd[765]: time="2025-11-24T09:11:07.877932647Z" level=info msg="ImageCreate event name:\"gcr.io/k8s-minikube/storage-provisioner:v5\""
	Nov 24 09:11:07 functional-291288 containerd[765]: time="2025-11-24T09:11:07.886080387Z" level=info msg="ImageCreate event name:\"sha256:66749159455b3f08c8318fe0233122f54d0f5889f9c5fdfb73c3fd9d99895b51\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Nov 24 09:11:07 functional-291288 containerd[765]: time="2025-11-24T09:11:07.886485954Z" level=info msg="ImageUpdate event name:\"gcr.io/k8s-minikube/storage-provisioner:v5\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Nov 24 09:11:10 functional-291288 containerd[765]: time="2025-11-24T09:11:10.399233088Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.5-0\""
	Nov 24 09:11:12 functional-291288 containerd[765]: time="2025-11-24T09:11:12.234790437Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.6.5-0\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Nov 24 09:11:12 functional-291288 containerd[765]: time="2025-11-24T09:11:12.238101094Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.6.5-0: active requests=0, bytes read=21124062"
	Nov 24 09:11:12 functional-291288 containerd[765]: time="2025-11-24T09:11:12.240498322Z" level=info msg="ImageCreate event name:\"sha256:2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Nov 24 09:11:12 functional-291288 containerd[765]: time="2025-11-24T09:11:12.246575942Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:042ef9c02799eb9303abf1aa99b09f09d94b8ee3ba0c2dd3f42dc4e1d3dce534\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Nov 24 09:11:12 functional-291288 containerd[765]: time="2025-11-24T09:11:12.247979822Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.6.5-0\" with image id \"sha256:2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42\", repo tag \"registry.k8s.io/etcd:3.6.5-0\", repo digest \"registry.k8s.io/etcd@sha256:042ef9c02799eb9303abf1aa99b09f09d94b8ee3ba0c2dd3f42dc4e1d3dce534\", size \"21136588\" in 1.848704313s"
	Nov 24 09:11:12 functional-291288 containerd[765]: time="2025-11-24T09:11:12.248027404Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.5-0\" returns image reference \"sha256:2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42\""
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:19:20.377424    5546 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:19:20.377917    5546 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:19:20.379623    5546 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:19:20.380176    5546 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:19:20.381798    5546 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[Nov24 08:20] overlayfs: idmapped layers are currently not supported
	[Nov24 08:21] overlayfs: idmapped layers are currently not supported
	[Nov24 08:22] overlayfs: idmapped layers are currently not supported
	[Nov24 08:23] overlayfs: idmapped layers are currently not supported
	[Nov24 08:24] overlayfs: idmapped layers are currently not supported
	[ +19.566509] overlayfs: idmapped layers are currently not supported
	[Nov24 08:25] overlayfs: idmapped layers are currently not supported
	[ +35.934095] overlayfs: idmapped layers are currently not supported
	[Nov24 08:27] overlayfs: idmapped layers are currently not supported
	[Nov24 08:28] overlayfs: idmapped layers are currently not supported
	[Nov24 08:29] overlayfs: idmapped layers are currently not supported
	[Nov24 08:30] overlayfs: idmapped layers are currently not supported
	[Nov24 08:31] overlayfs: idmapped layers are currently not supported
	[ +44.505180] overlayfs: idmapped layers are currently not supported
	[Nov24 08:32] overlayfs: idmapped layers are currently not supported
	[Nov24 08:33] overlayfs: idmapped layers are currently not supported
	[Nov24 08:34] overlayfs: idmapped layers are currently not supported
	[ +21.137877] overlayfs: idmapped layers are currently not supported
	[ +28.724175] overlayfs: idmapped layers are currently not supported
	[Nov24 08:36] overlayfs: idmapped layers are currently not supported
	[Nov24 08:37] overlayfs: idmapped layers are currently not supported
	[Nov24 08:38] overlayfs: idmapped layers are currently not supported
	[  +4.063834] overlayfs: idmapped layers are currently not supported
	[Nov24 08:40] overlayfs: idmapped layers are currently not supported
	[Nov24 08:42] kauditd_printk_skb: 8 callbacks suppressed
	
	
	==> kernel <==
	 09:19:20 up  8:01,  0 user,  load average: 0.41, 0.25, 0.61
	Linux functional-291288 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Nov 24 09:19:17 functional-291288 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Nov 24 09:19:17 functional-291288 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 318.
	Nov 24 09:19:17 functional-291288 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:19:17 functional-291288 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:19:17 functional-291288 kubelet[5353]: E1124 09:19:17.928227    5353 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Nov 24 09:19:17 functional-291288 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Nov 24 09:19:17 functional-291288 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Nov 24 09:19:18 functional-291288 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 319.
	Nov 24 09:19:18 functional-291288 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:19:18 functional-291288 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:19:18 functional-291288 kubelet[5358]: E1124 09:19:18.685064    5358 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Nov 24 09:19:18 functional-291288 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Nov 24 09:19:18 functional-291288 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Nov 24 09:19:19 functional-291288 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 320.
	Nov 24 09:19:19 functional-291288 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:19:19 functional-291288 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:19:19 functional-291288 kubelet[5442]: E1124 09:19:19.454504    5442 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Nov 24 09:19:19 functional-291288 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Nov 24 09:19:19 functional-291288 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Nov 24 09:19:20 functional-291288 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 321.
	Nov 24 09:19:20 functional-291288 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:19:20 functional-291288 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:19:20 functional-291288 kubelet[5495]: E1124 09:19:20.192925    5495 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Nov 24 09:19:20 functional-291288 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Nov 24 09:19:20 functional-291288 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	

-- /stdout --
helpers_test.go:262: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-291288 -n functional-291288
helpers_test.go:262: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-291288 -n functional-291288: exit status 6 (336.141206ms)

-- stdout --
	Stopped
	WARNING: Your kubectl is pointing to stale minikube-vm.
	To fix the kubectl context, run `minikube update-context`

-- /stdout --
** stderr ** 
	E1124 09:19:20.849382 1701221 status.go:458] kubeconfig endpoint: get endpoint: "functional-291288" does not appear in /home/jenkins/minikube-integration/21978-1652607/kubeconfig

** /stderr **
helpers_test.go:262: status error: exit status 6 (may be ok)
helpers_test.go:264: "functional-291288" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/StartWithProxy (510.27s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/SoftStart (369.25s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/SoftStart
I1124 09:19:20.866727 1654467 config.go:182] Loaded profile config "functional-291288": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
functional_test.go:674: (dbg) Run:  out/minikube-linux-arm64 start -p functional-291288 --alsologtostderr -v=8
E1124 09:21:03.604801 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/addons-674149/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1124 09:21:24.717205 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-941011/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
functional_test.go:674: (dbg) Non-zero exit: out/minikube-linux-arm64 start -p functional-291288 --alsologtostderr -v=8: exit status 80 (6m6.539251382s)

-- stdout --
	* [functional-291288] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	  - MINIKUBE_LOCATION=21978
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/21978-1652607/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/21978-1652607/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-arm64
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the docker driver based on existing profile
	* Starting "functional-291288" primary control-plane node in "functional-291288" cluster
	* Pulling base image v0.0.48-1763789673-21948 ...
	* Preparing Kubernetes v1.35.0-beta.0 on containerd 2.1.5 ...
	* Verifying Kubernetes components...
	  - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	* Enabled addons: 
	
	

-- /stdout --
** stderr ** 
	I1124 09:19:20.929895 1701291 out.go:360] Setting OutFile to fd 1 ...
	I1124 09:19:20.930102 1701291 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1124 09:19:20.930128 1701291 out.go:374] Setting ErrFile to fd 2...
	I1124 09:19:20.930149 1701291 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1124 09:19:20.930488 1701291 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21978-1652607/.minikube/bin
	I1124 09:19:20.930883 1701291 out.go:368] Setting JSON to false
	I1124 09:19:20.931751 1701291 start.go:133] hostinfo: {"hostname":"ip-172-31-30-239","uptime":28890,"bootTime":1763947071,"procs":158,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"92f46a7d-c249-4c12-924a-77f64874c910"}
	I1124 09:19:20.931843 1701291 start.go:143] virtualization:  
	I1124 09:19:20.938521 1701291 out.go:179] * [functional-291288] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1124 09:19:20.941571 1701291 out.go:179]   - MINIKUBE_LOCATION=21978
	I1124 09:19:20.941660 1701291 notify.go:221] Checking for updates...
	I1124 09:19:20.947508 1701291 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1124 09:19:20.950282 1701291 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/21978-1652607/kubeconfig
	I1124 09:19:20.953189 1701291 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/21978-1652607/.minikube
	I1124 09:19:20.956068 1701291 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1124 09:19:20.958991 1701291 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1124 09:19:20.962273 1701291 config.go:182] Loaded profile config "functional-291288": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1124 09:19:20.962433 1701291 driver.go:422] Setting default libvirt URI to qemu:///system
	I1124 09:19:20.992476 1701291 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1124 09:19:20.992586 1701291 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1124 09:19:21.057666 1701291 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-11-24 09:19:21.047762616 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214831104 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1124 09:19:21.057787 1701291 docker.go:319] overlay module found
	I1124 09:19:21.060830 1701291 out.go:179] * Using the docker driver based on existing profile
	I1124 09:19:21.063549 1701291 start.go:309] selected driver: docker
	I1124 09:19:21.063567 1701291 start.go:927] validating driver "docker" against &{Name:functional-291288 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-291288 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1124 09:19:21.063661 1701291 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1124 09:19:21.063775 1701291 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1124 09:19:21.121254 1701291 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-11-24 09:19:21.111151392 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214831104 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1124 09:19:21.121789 1701291 cni.go:84] Creating CNI manager for ""
	I1124 09:19:21.121863 1701291 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1124 09:19:21.121942 1701291 start.go:353] cluster config:
	{Name:functional-291288 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-291288 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1124 09:19:21.125134 1701291 out.go:179] * Starting "functional-291288" primary control-plane node in "functional-291288" cluster
	I1124 09:19:21.127989 1701291 cache.go:134] Beginning downloading kic base image for docker with containerd
	I1124 09:19:21.131005 1701291 out.go:179] * Pulling base image v0.0.48-1763789673-21948 ...
	I1124 09:19:21.133917 1701291 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f in local docker daemon
	I1124 09:19:21.133914 1701291 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
	I1124 09:19:21.154192 1701291 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f in local docker daemon, skipping pull
	I1124 09:19:21.154216 1701291 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f exists in daemon, skipping load
	W1124 09:19:21.197477 1701291 preload.go:144] https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.35.0-beta.0/preloaded-images-k8s-v18-v1.35.0-beta.0-containerd-overlay2-arm64.tar.lz4 status code: 404
	W1124 09:19:21.391690 1701291 preload.go:144] https://github.com/kubernetes-sigs/minikube-preloads/releases/download/v18/preloaded-images-k8s-v18-v1.35.0-beta.0-containerd-overlay2-arm64.tar.lz4 status code: 404
	I1124 09:19:21.391947 1701291 profile.go:143] Saving config to /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/config.json ...
	I1124 09:19:21.392070 1701291 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:19:21.392253 1701291 cache.go:243] Successfully downloaded all kic artifacts
	I1124 09:19:21.392304 1701291 start.go:360] acquireMachinesLock for functional-291288: {Name:mk85384dc057570e1f34db593d357cea738652c4 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:19:21.392403 1701291 start.go:364] duration metric: took 38.802µs to acquireMachinesLock for "functional-291288"
	I1124 09:19:21.392443 1701291 start.go:96] Skipping create...Using existing machine configuration
	I1124 09:19:21.392463 1701291 fix.go:54] fixHost starting: 
	I1124 09:19:21.392780 1701291 cli_runner.go:164] Run: docker container inspect functional-291288 --format={{.State.Status}}
	I1124 09:19:21.413220 1701291 fix.go:112] recreateIfNeeded on functional-291288: state=Running err=<nil>
	W1124 09:19:21.413254 1701291 fix.go:138] unexpected machine state, will restart: <nil>
	I1124 09:19:21.416439 1701291 out.go:252] * Updating the running docker "functional-291288" container ...
	I1124 09:19:21.416481 1701291 machine.go:94] provisionDockerMachine start ...
	I1124 09:19:21.416565 1701291 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:19:21.444143 1701291 main.go:143] libmachine: Using SSH client type: native
	I1124 09:19:21.444471 1701291 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 34684 <nil> <nil>}
	I1124 09:19:21.444480 1701291 main.go:143] libmachine: About to run SSH command:
	hostname
	I1124 09:19:21.581815 1701291 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:19:21.598566 1701291 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-291288
	
	I1124 09:19:21.598592 1701291 ubuntu.go:182] provisioning hostname "functional-291288"
	I1124 09:19:21.598669 1701291 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:19:21.623443 1701291 main.go:143] libmachine: Using SSH client type: native
	I1124 09:19:21.623759 1701291 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 34684 <nil> <nil>}
	I1124 09:19:21.623771 1701291 main.go:143] libmachine: About to run SSH command:
	sudo hostname functional-291288 && echo "functional-291288" | sudo tee /etc/hostname
	I1124 09:19:21.758572 1701291 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:19:21.799121 1701291 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-291288
	
	I1124 09:19:21.799200 1701291 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:19:21.831127 1701291 main.go:143] libmachine: Using SSH client type: native
	I1124 09:19:21.831435 1701291 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 34684 <nil> <nil>}
	I1124 09:19:21.831451 1701291 main.go:143] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sfunctional-291288' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 functional-291288/g' /etc/hosts;
				else 
					echo '127.0.1.1 functional-291288' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1124 09:19:21.919264 1701291 cache.go:107] acquiring lock: {Name:mk22a10f0ce1f3295b61e7e76c455d0494a3e278 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:19:21.919300 1701291 cache.go:107] acquiring lock: {Name:mk1cf42e67442503a46c578224bd3cb68bf682d4 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:19:21.919365 1701291 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 exists
	I1124 09:19:21.919369 1701291 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 exists
	I1124 09:19:21.919375 1701291 cache.go:96] cache image "registry.k8s.io/kube-apiserver:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0" took 74.126µs
	I1124 09:19:21.919377 1701291 cache.go:96] cache image "gcr.io/k8s-minikube/storage-provisioner:v5" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5" took 130.274µs
	I1124 09:19:21.919383 1701291 cache.go:80] save to tar file registry.k8s.io/kube-apiserver:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 succeeded
	I1124 09:19:21.919385 1701291 cache.go:80] save to tar file gcr.io/k8s-minikube/storage-provisioner:v5 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 succeeded
	I1124 09:19:21.919395 1701291 cache.go:107] acquiring lock: {Name:mkfdc49c8e68aee34cee0c9d441ae8a4dca675c6 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:19:21.919407 1701291 cache.go:107] acquiring lock: {Name:mk85f1502dbb97830776608fb729eb3605e112e6 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:19:21.919449 1701291 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 exists
	I1124 09:19:21.919433 1701291 cache.go:107] acquiring lock: {Name:mkdbf38e05e2c47c1a7a906a2236e9e7020a94c5 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:19:21.919454 1701291 cache.go:96] cache image "registry.k8s.io/pause:3.10.1" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1" took 48.764µs
	I1124 09:19:21.919460 1701291 cache.go:80] save to tar file registry.k8s.io/pause:3.10.1 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 succeeded
	I1124 09:19:21.919471 1701291 cache.go:107] acquiring lock: {Name:mk46ce3b59d7e062b3dbc8a90fe5b4231f256471 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:19:21.919266 1701291 cache.go:107] acquiring lock: {Name:mk80fdbe7cdb5bc17c2a82b4ecfd00214559a435 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:19:21.919495 1701291 cache.go:107] acquiring lock: {Name:mk726502cb84c177b2e14fee88512325761511c5 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:19:21.919506 1701291 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 exists
	I1124 09:19:21.919511 1701291 cache.go:96] cache image "registry.k8s.io/kube-proxy:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0" took 262.074µs
	I1124 09:19:21.919517 1701291 cache.go:80] save to tar file registry.k8s.io/kube-proxy:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 succeeded
	I1124 09:19:21.919425 1701291 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 exists
	I1124 09:19:21.919525 1701291 cache.go:96] cache image "registry.k8s.io/kube-controller-manager:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0" took 132.661µs
	I1124 09:19:21.919532 1701291 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 exists
	I1124 09:19:21.919476 1701291 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 exists
	I1124 09:19:21.919540 1701291 cache.go:96] cache image "registry.k8s.io/coredns/coredns:v1.13.1" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1" took 48.796µs
	I1124 09:19:21.919547 1701291 cache.go:80] save to tar file registry.k8s.io/coredns/coredns:v1.13.1 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 succeeded
	I1124 09:19:21.919541 1701291 cache.go:96] cache image "registry.k8s.io/kube-scheduler:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0" took 109.4µs
	I1124 09:19:21.919553 1701291 cache.go:80] save to tar file registry.k8s.io/kube-scheduler:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 succeeded
	I1124 09:19:21.919533 1701291 cache.go:80] save to tar file registry.k8s.io/kube-controller-manager:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 succeeded
	I1124 09:19:21.919557 1701291 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.5.24-0 exists
	I1124 09:19:21.919563 1701291 cache.go:96] cache image "registry.k8s.io/etcd:3.5.24-0" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.5.24-0" took 93.482µs
	I1124 09:19:21.919568 1701291 cache.go:80] save to tar file registry.k8s.io/etcd:3.5.24-0 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.5.24-0 succeeded
	I1124 09:19:21.919582 1701291 cache.go:87] Successfully saved all images to host disk.
	I1124 09:19:21.982718 1701291 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	I1124 09:19:21.982799 1701291 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/21978-1652607/.minikube CaCertPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/21978-1652607/.minikube}
	I1124 09:19:21.982852 1701291 ubuntu.go:190] setting up certificates
	I1124 09:19:21.982880 1701291 provision.go:84] configureAuth start
	I1124 09:19:21.982954 1701291 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-291288
	I1124 09:19:22.001413 1701291 provision.go:143] copyHostCerts
	I1124 09:19:22.001464 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.pem
	I1124 09:19:22.001516 1701291 exec_runner.go:144] found /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.pem, removing ...
	I1124 09:19:22.001530 1701291 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.pem
	I1124 09:19:22.001614 1701291 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.pem (1078 bytes)
	I1124 09:19:22.001708 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cert.pem
	I1124 09:19:22.001726 1701291 exec_runner.go:144] found /home/jenkins/minikube-integration/21978-1652607/.minikube/cert.pem, removing ...
	I1124 09:19:22.001731 1701291 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21978-1652607/.minikube/cert.pem
	I1124 09:19:22.001757 1701291 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/21978-1652607/.minikube/cert.pem (1123 bytes)
	I1124 09:19:22.001795 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/21978-1652607/.minikube/key.pem
	I1124 09:19:22.001816 1701291 exec_runner.go:144] found /home/jenkins/minikube-integration/21978-1652607/.minikube/key.pem, removing ...
	I1124 09:19:22.001820 1701291 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21978-1652607/.minikube/key.pem
	I1124 09:19:22.001845 1701291 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/21978-1652607/.minikube/key.pem (1679 bytes)
	I1124 09:19:22.001893 1701291 provision.go:117] generating server cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca-key.pem org=jenkins.functional-291288 san=[127.0.0.1 192.168.49.2 functional-291288 localhost minikube]
	I1124 09:19:22.129571 1701291 provision.go:177] copyRemoteCerts
	I1124 09:19:22.129639 1701291 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1124 09:19:22.129681 1701291 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:19:22.147944 1701291 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34684 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-291288/id_rsa Username:docker}
	I1124 09:19:22.254207 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I1124 09:19:22.254271 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I1124 09:19:22.271706 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server.pem -> /etc/docker/server.pem
	I1124 09:19:22.271768 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server.pem --> /etc/docker/server.pem (1220 bytes)
	I1124 09:19:22.289262 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I1124 09:19:22.289325 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I1124 09:19:22.306621 1701291 provision.go:87] duration metric: took 323.706379ms to configureAuth
	I1124 09:19:22.306647 1701291 ubuntu.go:206] setting minikube options for container-runtime
	I1124 09:19:22.306839 1701291 config.go:182] Loaded profile config "functional-291288": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1124 09:19:22.306847 1701291 machine.go:97] duration metric: took 890.360502ms to provisionDockerMachine
	I1124 09:19:22.306855 1701291 start.go:293] postStartSetup for "functional-291288" (driver="docker")
	I1124 09:19:22.306866 1701291 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1124 09:19:22.306912 1701291 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1124 09:19:22.306953 1701291 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:19:22.324012 1701291 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34684 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-291288/id_rsa Username:docker}
	I1124 09:19:22.434427 1701291 ssh_runner.go:195] Run: cat /etc/os-release
	I1124 09:19:22.437860 1701291 command_runner.go:130] > PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	I1124 09:19:22.437881 1701291 command_runner.go:130] > NAME="Debian GNU/Linux"
	I1124 09:19:22.437886 1701291 command_runner.go:130] > VERSION_ID="12"
	I1124 09:19:22.437890 1701291 command_runner.go:130] > VERSION="12 (bookworm)"
	I1124 09:19:22.437898 1701291 command_runner.go:130] > VERSION_CODENAME=bookworm
	I1124 09:19:22.437901 1701291 command_runner.go:130] > ID=debian
	I1124 09:19:22.437906 1701291 command_runner.go:130] > HOME_URL="https://www.debian.org/"
	I1124 09:19:22.437910 1701291 command_runner.go:130] > SUPPORT_URL="https://www.debian.org/support"
	I1124 09:19:22.437917 1701291 command_runner.go:130] > BUG_REPORT_URL="https://bugs.debian.org/"
	I1124 09:19:22.437980 1701291 main.go:143] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1124 09:19:22.437995 1701291 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1124 09:19:22.438006 1701291 filesync.go:126] Scanning /home/jenkins/minikube-integration/21978-1652607/.minikube/addons for local assets ...
	I1124 09:19:22.438064 1701291 filesync.go:126] Scanning /home/jenkins/minikube-integration/21978-1652607/.minikube/files for local assets ...
	I1124 09:19:22.438143 1701291 filesync.go:149] local asset: /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/ssl/certs/16544672.pem -> 16544672.pem in /etc/ssl/certs
	I1124 09:19:22.438150 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/ssl/certs/16544672.pem -> /etc/ssl/certs/16544672.pem
	I1124 09:19:22.438232 1701291 filesync.go:149] local asset: /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/test/nested/copy/1654467/hosts -> hosts in /etc/test/nested/copy/1654467
	I1124 09:19:22.438236 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/test/nested/copy/1654467/hosts -> /etc/test/nested/copy/1654467/hosts
	I1124 09:19:22.438277 1701291 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs /etc/test/nested/copy/1654467
	I1124 09:19:22.446265 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/ssl/certs/16544672.pem --> /etc/ssl/certs/16544672.pem (1708 bytes)
	I1124 09:19:22.463769 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/test/nested/copy/1654467/hosts --> /etc/test/nested/copy/1654467/hosts (40 bytes)
	I1124 09:19:22.481365 1701291 start.go:296] duration metric: took 174.495413ms for postStartSetup
	I1124 09:19:22.481446 1701291 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1124 09:19:22.481495 1701291 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:19:22.498552 1701291 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34684 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-291288/id_rsa Username:docker}
	I1124 09:19:22.598952 1701291 command_runner.go:130] > 14%
	I1124 09:19:22.599551 1701291 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1124 09:19:22.604050 1701291 command_runner.go:130] > 168G
	I1124 09:19:22.604631 1701291 fix.go:56] duration metric: took 1.212164413s for fixHost
	I1124 09:19:22.604655 1701291 start.go:83] releasing machines lock for "functional-291288", held for 1.212220037s
	I1124 09:19:22.604753 1701291 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-291288
	I1124 09:19:22.621885 1701291 ssh_runner.go:195] Run: cat /version.json
	I1124 09:19:22.621944 1701291 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:19:22.622207 1701291 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1124 09:19:22.622270 1701291 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:19:22.640397 1701291 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34684 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-291288/id_rsa Username:docker}
	I1124 09:19:22.648463 1701291 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34684 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-291288/id_rsa Username:docker}
	I1124 09:19:22.746016 1701291 command_runner.go:130] > {"iso_version": "v1.37.0-1763503576-21924", "kicbase_version": "v0.0.48-1763789673-21948", "minikube_version": "v1.37.0", "commit": "2996c7ec74d570fa8ab37e6f4f8813150d0c7473"}
	I1124 09:19:22.746158 1701291 ssh_runner.go:195] Run: systemctl --version
	I1124 09:19:22.840219 1701291 command_runner.go:130] > <a href="https://github.com/kubernetes/registry.k8s.io">Temporary Redirect</a>.
	I1124 09:19:22.840264 1701291 command_runner.go:130] > systemd 252 (252.39-1~deb12u1)
	I1124 09:19:22.840285 1701291 command_runner.go:130] > +PAM +AUDIT +SELINUX +APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT +QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified
	I1124 09:19:22.840354 1701291 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I1124 09:19:22.844675 1701291 command_runner.go:130] ! stat: cannot statx '/etc/cni/net.d/*loopback.conf*': No such file or directory
	W1124 09:19:22.844725 1701291 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1124 09:19:22.844793 1701291 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1124 09:19:22.852461 1701291 cni.go:259] no active bridge cni configs found in "/etc/cni/net.d" - nothing to disable
	I1124 09:19:22.852484 1701291 start.go:496] detecting cgroup driver to use...
	I1124 09:19:22.852517 1701291 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1124 09:19:22.852584 1701291 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I1124 09:19:22.868240 1701291 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I1124 09:19:22.881367 1701291 docker.go:218] disabling cri-docker service (if available) ...
	I1124 09:19:22.881470 1701291 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1124 09:19:22.896889 1701291 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1124 09:19:22.910017 1701291 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1124 09:19:23.028071 1701291 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1124 09:19:23.171419 1701291 docker.go:234] disabling docker service ...
	I1124 09:19:23.171539 1701291 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1124 09:19:23.187505 1701291 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1124 09:19:23.201405 1701291 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1124 09:19:23.324426 1701291 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1124 09:19:23.445186 1701291 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1124 09:19:23.457903 1701291 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I1124 09:19:23.470553 1701291 command_runner.go:130] > runtime-endpoint: unix:///run/containerd/containerd.sock
	I1124 09:19:23.472034 1701291 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:19:23.623898 1701291 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10.1"|' /etc/containerd/config.toml"
	I1124 09:19:23.632988 1701291 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I1124 09:19:23.641976 1701291 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I1124 09:19:23.642063 1701291 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I1124 09:19:23.651244 1701291 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1124 09:19:23.660198 1701291 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I1124 09:19:23.668706 1701291 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1124 09:19:23.677261 1701291 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1124 09:19:23.685600 1701291 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I1124 09:19:23.694593 1701291 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I1124 09:19:23.703191 1701291 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I1124 09:19:23.712006 1701291 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1124 09:19:23.718640 1701291 command_runner.go:130] > net.bridge.bridge-nf-call-iptables = 1
	I1124 09:19:23.719691 1701291 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1124 09:19:23.727172 1701291 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1124 09:19:23.844539 1701291 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I1124 09:19:23.964625 1701291 start.go:543] Will wait 60s for socket path /run/containerd/containerd.sock
	I1124 09:19:23.964708 1701291 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I1124 09:19:23.969624 1701291 command_runner.go:130] >   File: /run/containerd/containerd.sock
	I1124 09:19:23.969648 1701291 command_runner.go:130] >   Size: 0         	Blocks: 0          IO Block: 4096   socket
	I1124 09:19:23.969655 1701291 command_runner.go:130] > Device: 0,72	Inode: 1619        Links: 1
	I1124 09:19:23.969671 1701291 command_runner.go:130] > Access: (0660/srw-rw----)  Uid: (    0/    root)   Gid: (    0/    root)
	I1124 09:19:23.969685 1701291 command_runner.go:130] > Access: 2025-11-24 09:19:23.931843190 +0000
	I1124 09:19:23.969693 1701291 command_runner.go:130] > Modify: 2025-11-24 09:19:23.931843190 +0000
	I1124 09:19:23.969699 1701291 command_runner.go:130] > Change: 2025-11-24 09:19:23.931843190 +0000
	I1124 09:19:23.969707 1701291 command_runner.go:130] >  Birth: -
	I1124 09:19:23.970283 1701291 start.go:564] Will wait 60s for crictl version
	I1124 09:19:23.970345 1701291 ssh_runner.go:195] Run: which crictl
	I1124 09:19:23.973724 1701291 command_runner.go:130] > /usr/local/bin/crictl
	I1124 09:19:23.974288 1701291 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1124 09:19:23.995301 1701291 command_runner.go:130] > Version:  0.1.0
	I1124 09:19:23.995587 1701291 command_runner.go:130] > RuntimeName:  containerd
	I1124 09:19:23.995841 1701291 command_runner.go:130] > RuntimeVersion:  v2.1.5
	I1124 09:19:23.996049 1701291 command_runner.go:130] > RuntimeApiVersion:  v1
	I1124 09:19:23.998158 1701291 start.go:580] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v2.1.5
	RuntimeApiVersion:  v1
	I1124 09:19:23.998238 1701291 ssh_runner.go:195] Run: containerd --version
	I1124 09:19:24.020107 1701291 command_runner.go:130] > containerd containerd.io v2.1.5 fcd43222d6b07379a4be9786bda52438f0dd16a1
	I1124 09:19:24.020449 1701291 ssh_runner.go:195] Run: containerd --version
	I1124 09:19:24.041776 1701291 command_runner.go:130] > containerd containerd.io v2.1.5 fcd43222d6b07379a4be9786bda52438f0dd16a1
	I1124 09:19:24.047417 1701291 out.go:179] * Preparing Kubernetes v1.35.0-beta.0 on containerd 2.1.5 ...
	I1124 09:19:24.050497 1701291 cli_runner.go:164] Run: docker network inspect functional-291288 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1124 09:19:24.067531 1701291 ssh_runner.go:195] Run: grep 192.168.49.1	host.minikube.internal$ /etc/hosts
	I1124 09:19:24.071507 1701291 command_runner.go:130] > 192.168.49.1	host.minikube.internal
	I1124 09:19:24.071622 1701291 kubeadm.go:884] updating cluster {Name:functional-291288 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-291288 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1124 09:19:24.071797 1701291 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:19:24.253230 1701291 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:19:24.402285 1701291 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:19:24.552419 1701291 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
	I1124 09:19:24.552515 1701291 ssh_runner.go:195] Run: sudo crictl images --output json
	I1124 09:19:24.577200 1701291 command_runner.go:130] > {
	I1124 09:19:24.577221 1701291 command_runner.go:130] >   "images":  [
	I1124 09:19:24.577226 1701291 command_runner.go:130] >     {
	I1124 09:19:24.577235 1701291 command_runner.go:130] >       "id":  "sha256:66749159455b3f08c8318fe0233122f54d0f5889f9c5fdfb73c3fd9d99895b51",
	I1124 09:19:24.577240 1701291 command_runner.go:130] >       "repoTags":  [
	I1124 09:19:24.577245 1701291 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner:v5"
	I1124 09:19:24.577248 1701291 command_runner.go:130] >       ],
	I1124 09:19:24.577252 1701291 command_runner.go:130] >       "repoDigests":  [],
	I1124 09:19:24.577256 1701291 command_runner.go:130] >       "size":  "8032639",
	I1124 09:19:24.577264 1701291 command_runner.go:130] >       "username":  "",
	I1124 09:19:24.577269 1701291 command_runner.go:130] >       "pinned":  false
	I1124 09:19:24.577272 1701291 command_runner.go:130] >     },
	I1124 09:19:24.577276 1701291 command_runner.go:130] >     {
	I1124 09:19:24.577283 1701291 command_runner.go:130] >       "id":  "sha256:e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf",
	I1124 09:19:24.577290 1701291 command_runner.go:130] >       "repoTags":  [
	I1124 09:19:24.577296 1701291 command_runner.go:130] >         "registry.k8s.io/coredns/coredns:v1.13.1"
	I1124 09:19:24.577299 1701291 command_runner.go:130] >       ],
	I1124 09:19:24.577308 1701291 command_runner.go:130] >       "repoDigests":  [],
	I1124 09:19:24.577330 1701291 command_runner.go:130] >       "size":  "21166088",
	I1124 09:19:24.577335 1701291 command_runner.go:130] >       "username":  "nonroot",
	I1124 09:19:24.577339 1701291 command_runner.go:130] >       "pinned":  false
	I1124 09:19:24.577349 1701291 command_runner.go:130] >     },
	I1124 09:19:24.577357 1701291 command_runner.go:130] >     {
	I1124 09:19:24.577364 1701291 command_runner.go:130] >       "id":  "sha256:1211402d28f5813ed906916bfcdd0a7404c2f9048ef5bb54387a6745bc410eca",
	I1124 09:19:24.577368 1701291 command_runner.go:130] >       "repoTags":  [
	I1124 09:19:24.577373 1701291 command_runner.go:130] >         "registry.k8s.io/etcd:3.5.24-0"
	I1124 09:19:24.577376 1701291 command_runner.go:130] >       ],
	I1124 09:19:24.577380 1701291 command_runner.go:130] >       "repoDigests":  [],
	I1124 09:19:24.577384 1701291 command_runner.go:130] >       "size":  "21880804",
	I1124 09:19:24.577391 1701291 command_runner.go:130] >       "uid":  {
	I1124 09:19:24.577395 1701291 command_runner.go:130] >         "value":  "0"
	I1124 09:19:24.577400 1701291 command_runner.go:130] >       },
	I1124 09:19:24.577404 1701291 command_runner.go:130] >       "username":  "",
	I1124 09:19:24.577408 1701291 command_runner.go:130] >       "pinned":  false
	I1124 09:19:24.577421 1701291 command_runner.go:130] >     },
	I1124 09:19:24.577424 1701291 command_runner.go:130] >     {
	I1124 09:19:24.577431 1701291 command_runner.go:130] >       "id":  "sha256:2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42",
	I1124 09:19:24.577434 1701291 command_runner.go:130] >       "repoTags":  [
	I1124 09:19:24.577443 1701291 command_runner.go:130] >         "registry.k8s.io/etcd:3.6.5-0"
	I1124 09:19:24.577450 1701291 command_runner.go:130] >       ],
	I1124 09:19:24.577454 1701291 command_runner.go:130] >       "repoDigests":  [
	I1124 09:19:24.577461 1701291 command_runner.go:130] >         "registry.k8s.io/etcd@sha256:042ef9c02799eb9303abf1aa99b09f09d94b8ee3ba0c2dd3f42dc4e1d3dce534"
	I1124 09:19:24.577465 1701291 command_runner.go:130] >       ],
	I1124 09:19:24.577469 1701291 command_runner.go:130] >       "size":  "21136588",
	I1124 09:19:24.577472 1701291 command_runner.go:130] >       "uid":  {
	I1124 09:19:24.577479 1701291 command_runner.go:130] >         "value":  "0"
	I1124 09:19:24.577482 1701291 command_runner.go:130] >       },
	I1124 09:19:24.577486 1701291 command_runner.go:130] >       "username":  "",
	I1124 09:19:24.577492 1701291 command_runner.go:130] >       "pinned":  false
	I1124 09:19:24.577495 1701291 command_runner.go:130] >     },
	I1124 09:19:24.577502 1701291 command_runner.go:130] >     {
	I1124 09:19:24.577512 1701291 command_runner.go:130] >       "id":  "sha256:ccd634d9bcc36ac6235e9c86676cd3a02c06afc3788a25f1bbf39ca7d44585f4",
	I1124 09:19:24.577516 1701291 command_runner.go:130] >       "repoTags":  [
	I1124 09:19:24.577521 1701291 command_runner.go:130] >         "registry.k8s.io/kube-apiserver:v1.35.0-beta.0"
	I1124 09:19:24.577527 1701291 command_runner.go:130] >       ],
	I1124 09:19:24.577531 1701291 command_runner.go:130] >       "repoDigests":  [],
	I1124 09:19:24.577535 1701291 command_runner.go:130] >       "size":  "24676285",
	I1124 09:19:24.577538 1701291 command_runner.go:130] >       "uid":  {
	I1124 09:19:24.577541 1701291 command_runner.go:130] >         "value":  "0"
	I1124 09:19:24.577545 1701291 command_runner.go:130] >       },
	I1124 09:19:24.577550 1701291 command_runner.go:130] >       "username":  "",
	I1124 09:19:24.577556 1701291 command_runner.go:130] >       "pinned":  false
	I1124 09:19:24.577560 1701291 command_runner.go:130] >     },
	I1124 09:19:24.577563 1701291 command_runner.go:130] >     {
	I1124 09:19:24.577569 1701291 command_runner.go:130] >       "id":  "sha256:68b5f775f18769fcb77bd8474c80bda2050163b6c66f4551f352b7381b8ca5be",
	I1124 09:19:24.577581 1701291 command_runner.go:130] >       "repoTags":  [
	I1124 09:19:24.577586 1701291 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager:v1.35.0-beta.0"
	I1124 09:19:24.577590 1701291 command_runner.go:130] >       ],
	I1124 09:19:24.577594 1701291 command_runner.go:130] >       "repoDigests":  [],
	I1124 09:19:24.577605 1701291 command_runner.go:130] >       "size":  "20658969",
	I1124 09:19:24.577608 1701291 command_runner.go:130] >       "uid":  {
	I1124 09:19:24.577612 1701291 command_runner.go:130] >         "value":  "0"
	I1124 09:19:24.577615 1701291 command_runner.go:130] >       },
	I1124 09:19:24.577619 1701291 command_runner.go:130] >       "username":  "",
	I1124 09:19:24.577624 1701291 command_runner.go:130] >       "pinned":  false
	I1124 09:19:24.577629 1701291 command_runner.go:130] >     },
	I1124 09:19:24.577633 1701291 command_runner.go:130] >     {
	I1124 09:19:24.577644 1701291 command_runner.go:130] >       "id":  "sha256:404c2e12861777b763b8feaa316d36680fc68ad308a8d2f6e55f1bb981cdd904",
	I1124 09:19:24.577655 1701291 command_runner.go:130] >       "repoTags":  [
	I1124 09:19:24.577660 1701291 command_runner.go:130] >         "registry.k8s.io/kube-proxy:v1.35.0-beta.0"
	I1124 09:19:24.577663 1701291 command_runner.go:130] >       ],
	I1124 09:19:24.577667 1701291 command_runner.go:130] >       "repoDigests":  [],
	I1124 09:19:24.577678 1701291 command_runner.go:130] >       "size":  "22428165",
	I1124 09:19:24.577686 1701291 command_runner.go:130] >       "username":  "",
	I1124 09:19:24.577692 1701291 command_runner.go:130] >       "pinned":  false
	I1124 09:19:24.577696 1701291 command_runner.go:130] >     },
	I1124 09:19:24.577706 1701291 command_runner.go:130] >     {
	I1124 09:19:24.577712 1701291 command_runner.go:130] >       "id":  "sha256:16378741539f1be9c6e347d127537d379a6592587b09b4eb47964cb5c43a409b",
	I1124 09:19:24.577716 1701291 command_runner.go:130] >       "repoTags":  [
	I1124 09:19:24.577721 1701291 command_runner.go:130] >         "registry.k8s.io/kube-scheduler:v1.35.0-beta.0"
	I1124 09:19:24.577724 1701291 command_runner.go:130] >       ],
	I1124 09:19:24.577728 1701291 command_runner.go:130] >       "repoDigests":  [],
	I1124 09:19:24.577738 1701291 command_runner.go:130] >       "size":  "15389290",
	I1124 09:19:24.577744 1701291 command_runner.go:130] >       "uid":  {
	I1124 09:19:24.577751 1701291 command_runner.go:130] >         "value":  "0"
	I1124 09:19:24.577754 1701291 command_runner.go:130] >       },
	I1124 09:19:24.577758 1701291 command_runner.go:130] >       "username":  "",
	I1124 09:19:24.577762 1701291 command_runner.go:130] >       "pinned":  false
	I1124 09:19:24.577768 1701291 command_runner.go:130] >     },
	I1124 09:19:24.577771 1701291 command_runner.go:130] >     {
	I1124 09:19:24.577779 1701291 command_runner.go:130] >       "id":  "sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd",
	I1124 09:19:24.577786 1701291 command_runner.go:130] >       "repoTags":  [
	I1124 09:19:24.577791 1701291 command_runner.go:130] >         "registry.k8s.io/pause:3.10.1"
	I1124 09:19:24.577794 1701291 command_runner.go:130] >       ],
	I1124 09:19:24.577797 1701291 command_runner.go:130] >       "repoDigests":  [],
	I1124 09:19:24.577801 1701291 command_runner.go:130] >       "size":  "265458",
	I1124 09:19:24.577805 1701291 command_runner.go:130] >       "uid":  {
	I1124 09:19:24.577809 1701291 command_runner.go:130] >         "value":  "65535"
	I1124 09:19:24.577815 1701291 command_runner.go:130] >       },
	I1124 09:19:24.577819 1701291 command_runner.go:130] >       "username":  "",
	I1124 09:19:24.577824 1701291 command_runner.go:130] >       "pinned":  true
	I1124 09:19:24.577827 1701291 command_runner.go:130] >     }
	I1124 09:19:24.577831 1701291 command_runner.go:130] >   ]
	I1124 09:19:24.577842 1701291 command_runner.go:130] > }
	I1124 09:19:24.577988 1701291 containerd.go:627] all images are preloaded for containerd runtime.
	I1124 09:19:24.578000 1701291 cache_images.go:86] Images are preloaded, skipping loading
	I1124 09:19:24.578012 1701291 kubeadm.go:935] updating node { 192.168.49.2 8441 v1.35.0-beta.0 containerd true true} ...
	I1124 09:19:24.578111 1701291 kubeadm.go:947] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.35.0-beta.0/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=functional-291288 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.49.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-291288 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I1124 09:19:24.578176 1701291 ssh_runner.go:195] Run: sudo crictl info
	I1124 09:19:24.601872 1701291 command_runner.go:130] > {
	I1124 09:19:24.601895 1701291 command_runner.go:130] >   "cniconfig": {
	I1124 09:19:24.601901 1701291 command_runner.go:130] >     "Networks": [
	I1124 09:19:24.601905 1701291 command_runner.go:130] >       {
	I1124 09:19:24.601909 1701291 command_runner.go:130] >         "Config": {
	I1124 09:19:24.601914 1701291 command_runner.go:130] >           "CNIVersion": "0.3.1",
	I1124 09:19:24.601919 1701291 command_runner.go:130] >           "Name": "cni-loopback",
	I1124 09:19:24.601924 1701291 command_runner.go:130] >           "Plugins": [
	I1124 09:19:24.601927 1701291 command_runner.go:130] >             {
	I1124 09:19:24.601931 1701291 command_runner.go:130] >               "Network": {
	I1124 09:19:24.601935 1701291 command_runner.go:130] >                 "ipam": {},
	I1124 09:19:24.601941 1701291 command_runner.go:130] >                 "type": "loopback"
	I1124 09:19:24.601945 1701291 command_runner.go:130] >               },
	I1124 09:19:24.601958 1701291 command_runner.go:130] >               "Source": "{\"type\":\"loopback\"}"
	I1124 09:19:24.601965 1701291 command_runner.go:130] >             }
	I1124 09:19:24.601969 1701291 command_runner.go:130] >           ],
	I1124 09:19:24.601979 1701291 command_runner.go:130] >           "Source": "{\n\"cniVersion\": \"0.3.1\",\n\"name\": \"cni-loopback\",\n\"plugins\": [{\n  \"type\": \"loopback\"\n}]\n}"
	I1124 09:19:24.601983 1701291 command_runner.go:130] >         },
	I1124 09:19:24.601991 1701291 command_runner.go:130] >         "IFName": "lo"
	I1124 09:19:24.601994 1701291 command_runner.go:130] >       }
	I1124 09:19:24.601997 1701291 command_runner.go:130] >     ],
	I1124 09:19:24.602003 1701291 command_runner.go:130] >     "PluginConfDir": "/etc/cni/net.d",
	I1124 09:19:24.602007 1701291 command_runner.go:130] >     "PluginDirs": [
	I1124 09:19:24.602014 1701291 command_runner.go:130] >       "/opt/cni/bin"
	I1124 09:19:24.602018 1701291 command_runner.go:130] >     ],
	I1124 09:19:24.602026 1701291 command_runner.go:130] >     "PluginMaxConfNum": 1,
	I1124 09:19:24.602030 1701291 command_runner.go:130] >     "Prefix": "eth"
	I1124 09:19:24.602033 1701291 command_runner.go:130] >   },
	I1124 09:19:24.602037 1701291 command_runner.go:130] >   "config": {
	I1124 09:19:24.602041 1701291 command_runner.go:130] >     "cdiSpecDirs": [
	I1124 09:19:24.602048 1701291 command_runner.go:130] >       "/etc/cdi",
	I1124 09:19:24.602051 1701291 command_runner.go:130] >       "/var/run/cdi"
	I1124 09:19:24.602055 1701291 command_runner.go:130] >     ],
	I1124 09:19:24.602069 1701291 command_runner.go:130] >     "cni": {
	I1124 09:19:24.602073 1701291 command_runner.go:130] >       "binDir": "",
	I1124 09:19:24.602076 1701291 command_runner.go:130] >       "binDirs": [
	I1124 09:19:24.602080 1701291 command_runner.go:130] >         "/opt/cni/bin"
	I1124 09:19:24.602083 1701291 command_runner.go:130] >       ],
	I1124 09:19:24.602087 1701291 command_runner.go:130] >       "confDir": "/etc/cni/net.d",
	I1124 09:19:24.602092 1701291 command_runner.go:130] >       "confTemplate": "",
	I1124 09:19:24.602098 1701291 command_runner.go:130] >       "ipPref": "",
	I1124 09:19:24.602103 1701291 command_runner.go:130] >       "maxConfNum": 1,
	I1124 09:19:24.602109 1701291 command_runner.go:130] >       "setupSerially": false,
	I1124 09:19:24.602114 1701291 command_runner.go:130] >       "useInternalLoopback": false
	I1124 09:19:24.602120 1701291 command_runner.go:130] >     },
	I1124 09:19:24.602126 1701291 command_runner.go:130] >     "containerd": {
	I1124 09:19:24.602132 1701291 command_runner.go:130] >       "defaultRuntimeName": "runc",
	I1124 09:19:24.602137 1701291 command_runner.go:130] >       "ignoreBlockIONotEnabledErrors": false,
	I1124 09:19:24.602145 1701291 command_runner.go:130] >       "ignoreRdtNotEnabledErrors": false,
	I1124 09:19:24.602149 1701291 command_runner.go:130] >       "runtimes": {
	I1124 09:19:24.602152 1701291 command_runner.go:130] >         "runc": {
	I1124 09:19:24.602157 1701291 command_runner.go:130] >           "ContainerAnnotations": null,
	I1124 09:19:24.602163 1701291 command_runner.go:130] >           "PodAnnotations": null,
	I1124 09:19:24.602169 1701291 command_runner.go:130] >           "baseRuntimeSpec": "",
	I1124 09:19:24.602174 1701291 command_runner.go:130] >           "cgroupWritable": false,
	I1124 09:19:24.602179 1701291 command_runner.go:130] >           "cniConfDir": "",
	I1124 09:19:24.602185 1701291 command_runner.go:130] >           "cniMaxConfNum": 0,
	I1124 09:19:24.602190 1701291 command_runner.go:130] >           "io_type": "",
	I1124 09:19:24.602195 1701291 command_runner.go:130] >           "options": {
	I1124 09:19:24.602200 1701291 command_runner.go:130] >             "BinaryName": "",
	I1124 09:19:24.602212 1701291 command_runner.go:130] >             "CriuImagePath": "",
	I1124 09:19:24.602217 1701291 command_runner.go:130] >             "CriuWorkPath": "",
	I1124 09:19:24.602221 1701291 command_runner.go:130] >             "IoGid": 0,
	I1124 09:19:24.602226 1701291 command_runner.go:130] >             "IoUid": 0,
	I1124 09:19:24.602232 1701291 command_runner.go:130] >             "NoNewKeyring": false,
	I1124 09:19:24.602237 1701291 command_runner.go:130] >             "Root": "",
	I1124 09:19:24.602243 1701291 command_runner.go:130] >             "ShimCgroup": "",
	I1124 09:19:24.602248 1701291 command_runner.go:130] >             "SystemdCgroup": false
	I1124 09:19:24.602252 1701291 command_runner.go:130] >           },
	I1124 09:19:24.602257 1701291 command_runner.go:130] >           "privileged_without_host_devices": false,
	I1124 09:19:24.602266 1701291 command_runner.go:130] >           "privileged_without_host_devices_all_devices_allowed": false,
	I1124 09:19:24.602272 1701291 command_runner.go:130] >           "runtimePath": "",
	I1124 09:19:24.602278 1701291 command_runner.go:130] >           "runtimeType": "io.containerd.runc.v2",
	I1124 09:19:24.602285 1701291 command_runner.go:130] >           "sandboxer": "podsandbox",
	I1124 09:19:24.602290 1701291 command_runner.go:130] >           "snapshotter": ""
	I1124 09:19:24.602293 1701291 command_runner.go:130] >         }
	I1124 09:19:24.602296 1701291 command_runner.go:130] >       }
	I1124 09:19:24.602299 1701291 command_runner.go:130] >     },
	I1124 09:19:24.602309 1701291 command_runner.go:130] >     "containerdEndpoint": "/run/containerd/containerd.sock",
	I1124 09:19:24.602332 1701291 command_runner.go:130] >     "containerdRootDir": "/var/lib/containerd",
	I1124 09:19:24.602339 1701291 command_runner.go:130] >     "device_ownership_from_security_context": false,
	I1124 09:19:24.602344 1701291 command_runner.go:130] >     "disableApparmor": false,
	I1124 09:19:24.602351 1701291 command_runner.go:130] >     "disableHugetlbController": true,
	I1124 09:19:24.602355 1701291 command_runner.go:130] >     "disableProcMount": false,
	I1124 09:19:24.602362 1701291 command_runner.go:130] >     "drainExecSyncIOTimeout": "0s",
	I1124 09:19:24.602366 1701291 command_runner.go:130] >     "enableCDI": true,
	I1124 09:19:24.602378 1701291 command_runner.go:130] >     "enableSelinux": false,
	I1124 09:19:24.602382 1701291 command_runner.go:130] >     "enableUnprivilegedICMP": true,
	I1124 09:19:24.602387 1701291 command_runner.go:130] >     "enableUnprivilegedPorts": true,
	I1124 09:19:24.602392 1701291 command_runner.go:130] >     "ignoreDeprecationWarnings": null,
	I1124 09:19:24.602403 1701291 command_runner.go:130] >     "ignoreImageDefinedVolumes": false,
	I1124 09:19:24.602408 1701291 command_runner.go:130] >     "maxContainerLogLineSize": 16384,
	I1124 09:19:24.602413 1701291 command_runner.go:130] >     "netnsMountsUnderStateDir": false,
	I1124 09:19:24.602417 1701291 command_runner.go:130] >     "restrictOOMScoreAdj": false,
	I1124 09:19:24.602422 1701291 command_runner.go:130] >     "rootDir": "/var/lib/containerd/io.containerd.grpc.v1.cri",
	I1124 09:19:24.602427 1701291 command_runner.go:130] >     "selinuxCategoryRange": 1024,
	I1124 09:19:24.602432 1701291 command_runner.go:130] >     "stateDir": "/run/containerd/io.containerd.grpc.v1.cri",
	I1124 09:19:24.602438 1701291 command_runner.go:130] >     "tolerateMissingHugetlbController": true,
	I1124 09:19:24.602441 1701291 command_runner.go:130] >     "unsetSeccompProfile": ""
	I1124 09:19:24.602445 1701291 command_runner.go:130] >   },
	I1124 09:19:24.602449 1701291 command_runner.go:130] >   "features": {
	I1124 09:19:24.602492 1701291 command_runner.go:130] >     "supplemental_groups_policy": true
	I1124 09:19:24.602500 1701291 command_runner.go:130] >   },
	I1124 09:19:24.602504 1701291 command_runner.go:130] >   "golang": "go1.24.9",
	I1124 09:19:24.602513 1701291 command_runner.go:130] >   "lastCNILoadStatus": "cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config",
	I1124 09:19:24.602527 1701291 command_runner.go:130] >   "lastCNILoadStatus.default": "cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config",
	I1124 09:19:24.602532 1701291 command_runner.go:130] >   "runtimeHandlers": [
	I1124 09:19:24.602537 1701291 command_runner.go:130] >     {
	I1124 09:19:24.602541 1701291 command_runner.go:130] >       "features": {
	I1124 09:19:24.602546 1701291 command_runner.go:130] >         "recursive_read_only_mounts": true,
	I1124 09:19:24.602550 1701291 command_runner.go:130] >         "user_namespaces": true
	I1124 09:19:24.602555 1701291 command_runner.go:130] >       }
	I1124 09:19:24.602564 1701291 command_runner.go:130] >     },
	I1124 09:19:24.602570 1701291 command_runner.go:130] >     {
	I1124 09:19:24.602575 1701291 command_runner.go:130] >       "features": {
	I1124 09:19:24.602587 1701291 command_runner.go:130] >         "recursive_read_only_mounts": true,
	I1124 09:19:24.602592 1701291 command_runner.go:130] >         "user_namespaces": true
	I1124 09:19:24.602595 1701291 command_runner.go:130] >       },
	I1124 09:19:24.602598 1701291 command_runner.go:130] >       "name": "runc"
	I1124 09:19:24.602609 1701291 command_runner.go:130] >     }
	I1124 09:19:24.602612 1701291 command_runner.go:130] >   ],
	I1124 09:19:24.602615 1701291 command_runner.go:130] >   "status": {
	I1124 09:19:24.602619 1701291 command_runner.go:130] >     "conditions": [
	I1124 09:19:24.602623 1701291 command_runner.go:130] >       {
	I1124 09:19:24.602629 1701291 command_runner.go:130] >         "message": "",
	I1124 09:19:24.602633 1701291 command_runner.go:130] >         "reason": "",
	I1124 09:19:24.602637 1701291 command_runner.go:130] >         "status": true,
	I1124 09:19:24.602641 1701291 command_runner.go:130] >         "type": "RuntimeReady"
	I1124 09:19:24.602645 1701291 command_runner.go:130] >       },
	I1124 09:19:24.602648 1701291 command_runner.go:130] >       {
	I1124 09:19:24.602655 1701291 command_runner.go:130] >         "message": "Network plugin returns error: cni plugin not initialized",
	I1124 09:19:24.602662 1701291 command_runner.go:130] >         "reason": "NetworkPluginNotReady",
	I1124 09:19:24.602666 1701291 command_runner.go:130] >         "status": false,
	I1124 09:19:24.602678 1701291 command_runner.go:130] >         "type": "NetworkReady"
	I1124 09:19:24.602682 1701291 command_runner.go:130] >       },
	I1124 09:19:24.602685 1701291 command_runner.go:130] >       {
	I1124 09:19:24.602688 1701291 command_runner.go:130] >         "message": "",
	I1124 09:19:24.602692 1701291 command_runner.go:130] >         "reason": "",
	I1124 09:19:24.602703 1701291 command_runner.go:130] >         "status": true,
	I1124 09:19:24.602709 1701291 command_runner.go:130] >         "type": "ContainerdHasNoDeprecationWarnings"
	I1124 09:19:24.602712 1701291 command_runner.go:130] >       }
	I1124 09:19:24.602715 1701291 command_runner.go:130] >     ]
	I1124 09:19:24.602718 1701291 command_runner.go:130] >   }
	I1124 09:19:24.602721 1701291 command_runner.go:130] > }
	I1124 09:19:24.603033 1701291 cni.go:84] Creating CNI manager for ""
	I1124 09:19:24.603051 1701291 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1124 09:19:24.603074 1701291 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1124 09:19:24.603102 1701291 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.49.2 APIServerPort:8441 KubernetesVersion:v1.35.0-beta.0 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:functional-291288 NodeName:functional-291288 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.49.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.49.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1124 09:19:24.603228 1701291 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.49.2
	  bindPort: 8441
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "functional-291288"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.49.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8441
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.35.0-beta.0
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I1124 09:19:24.603309 1701291 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.35.0-beta.0
	I1124 09:19:24.611119 1701291 command_runner.go:130] > kubeadm
	I1124 09:19:24.611140 1701291 command_runner.go:130] > kubectl
	I1124 09:19:24.611146 1701291 command_runner.go:130] > kubelet
	I1124 09:19:24.611161 1701291 binaries.go:51] Found k8s binaries, skipping transfer
	I1124 09:19:24.611223 1701291 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1124 09:19:24.618883 1701291 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (328 bytes)
	I1124 09:19:24.633448 1701291 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (359 bytes)
	I1124 09:19:24.650072 1701291 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2237 bytes)
	I1124 09:19:24.664688 1701291 ssh_runner.go:195] Run: grep 192.168.49.2	control-plane.minikube.internal$ /etc/hosts
	I1124 09:19:24.668362 1701291 command_runner.go:130] > 192.168.49.2	control-plane.minikube.internal
	I1124 09:19:24.668996 1701291 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1124 09:19:24.787731 1701291 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1124 09:19:25.630718 1701291 certs.go:69] Setting up /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288 for IP: 192.168.49.2
	I1124 09:19:25.630736 1701291 certs.go:195] generating shared ca certs ...
	I1124 09:19:25.630751 1701291 certs.go:227] acquiring lock for ca certs: {Name:mkbe540a30c4376a351176f7fe6fec044d058b09 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1124 09:19:25.630878 1701291 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.key
	I1124 09:19:25.630932 1701291 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/proxy-client-ca.key
	I1124 09:19:25.630939 1701291 certs.go:257] generating profile certs ...
	I1124 09:19:25.631060 1701291 certs.go:360] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/client.key
	I1124 09:19:25.631119 1701291 certs.go:360] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/apiserver.key.5acb2515
	I1124 09:19:25.631156 1701291 certs.go:360] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/proxy-client.key
	I1124 09:19:25.631166 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I1124 09:19:25.631180 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I1124 09:19:25.631190 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I1124 09:19:25.631200 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I1124 09:19:25.631210 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I1124 09:19:25.631221 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I1124 09:19:25.631231 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I1124 09:19:25.631241 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I1124 09:19:25.631304 1701291 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/1654467.pem (1338 bytes)
	W1124 09:19:25.631338 1701291 certs.go:480] ignoring /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/1654467_empty.pem, impossibly tiny 0 bytes
	I1124 09:19:25.631352 1701291 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca-key.pem (1671 bytes)
	I1124 09:19:25.631382 1701291 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem (1078 bytes)
	I1124 09:19:25.631410 1701291 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/cert.pem (1123 bytes)
	I1124 09:19:25.631434 1701291 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/key.pem (1679 bytes)
	I1124 09:19:25.631484 1701291 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/ssl/certs/16544672.pem (1708 bytes)
	I1124 09:19:25.631512 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I1124 09:19:25.631529 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/1654467.pem -> /usr/share/ca-certificates/1654467.pem
	I1124 09:19:25.631542 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/ssl/certs/16544672.pem -> /usr/share/ca-certificates/16544672.pem
	I1124 09:19:25.632117 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1124 09:19:25.653566 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I1124 09:19:25.672677 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1124 09:19:25.692448 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I1124 09:19:25.712758 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I1124 09:19:25.730246 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I1124 09:19:25.748136 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1124 09:19:25.765102 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I1124 09:19:25.782676 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1124 09:19:25.800418 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/1654467.pem --> /usr/share/ca-certificates/1654467.pem (1338 bytes)
	I1124 09:19:25.818179 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/ssl/certs/16544672.pem --> /usr/share/ca-certificates/16544672.pem (1708 bytes)
	I1124 09:19:25.836420 1701291 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I1124 09:19:25.849273 1701291 ssh_runner.go:195] Run: openssl version
	I1124 09:19:25.855675 1701291 command_runner.go:130] > OpenSSL 3.0.17 1 Jul 2025 (Library: OpenSSL 3.0.17 1 Jul 2025)
	I1124 09:19:25.855803 1701291 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I1124 09:19:25.864243 1701291 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1124 09:19:25.867919 1701291 command_runner.go:130] > -rw-r--r-- 1 root root 1111 Nov 24 08:44 /usr/share/ca-certificates/minikubeCA.pem
	I1124 09:19:25.867982 1701291 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Nov 24 08:44 /usr/share/ca-certificates/minikubeCA.pem
	I1124 09:19:25.868042 1701291 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1124 09:19:25.908611 1701291 command_runner.go:130] > b5213941
	I1124 09:19:25.909123 1701291 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I1124 09:19:25.916880 1701291 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1654467.pem && ln -fs /usr/share/ca-certificates/1654467.pem /etc/ssl/certs/1654467.pem"
	I1124 09:19:25.925097 1701291 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1654467.pem
	I1124 09:19:25.928711 1701291 command_runner.go:130] > -rw-r--r-- 1 root root 1338 Nov 24 09:10 /usr/share/ca-certificates/1654467.pem
	I1124 09:19:25.928823 1701291 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Nov 24 09:10 /usr/share/ca-certificates/1654467.pem
	I1124 09:19:25.928900 1701291 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1654467.pem
	I1124 09:19:25.969833 1701291 command_runner.go:130] > 51391683
	I1124 09:19:25.970298 1701291 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1654467.pem /etc/ssl/certs/51391683.0"
	I1124 09:19:25.978202 1701291 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/16544672.pem && ln -fs /usr/share/ca-certificates/16544672.pem /etc/ssl/certs/16544672.pem"
	I1124 09:19:25.986297 1701291 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/16544672.pem
	I1124 09:19:25.989958 1701291 command_runner.go:130] > -rw-r--r-- 1 root root 1708 Nov 24 09:10 /usr/share/ca-certificates/16544672.pem
	I1124 09:19:25.990028 1701291 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Nov 24 09:10 /usr/share/ca-certificates/16544672.pem
	I1124 09:19:25.990094 1701291 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/16544672.pem
	I1124 09:19:26.030947 1701291 command_runner.go:130] > 3ec20f2e
	I1124 09:19:26.031428 1701291 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/16544672.pem /etc/ssl/certs/3ec20f2e.0"
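The three rounds above each compute a certificate's OpenSSL subject-name hash and link it into `/etc/ssl/certs` as `<hash>.0`, which is how the OpenSSL cert-directory lookup finds trusted CAs. A minimal sketch of that convention, using a throwaway self-signed cert under a temp directory instead of minikube's real files (`openssl` assumed on PATH; all paths here are illustrative):

```shell
tmp=$(mktemp -d)

# Create a throwaway self-signed cert to stand in for minikubeCA.pem.
openssl req -x509 -newkey rsa:2048 -nodes -subj "/CN=demoCA" \
  -keyout "$tmp/ca.key" -out "$tmp/ca.pem" -days 1 2>/dev/null

# `openssl x509 -hash` prints the subject-name hash that the cert-directory
# lookup expects as a filename (b5213941 plays this role in the log above).
hash=$(openssl x509 -hash -noout -in "$tmp/ca.pem")

# Link the cert as <hash>.0 -- the same shape as /etc/ssl/certs/b5213941.0.
ln -fs "$tmp/ca.pem" "$tmp/$hash.0"
echo "installed as $hash.0"
```

The `test -L … || ln -fs …` guard in the log is just an idempotence check: re-running the provisioning step leaves an existing symlink alone.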
	I1124 09:19:26.039972 1701291 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1124 09:19:26.043966 1701291 command_runner.go:130] >   File: /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1124 09:19:26.043995 1701291 command_runner.go:130] >   Size: 1176      	Blocks: 8          IO Block: 4096   regular file
	I1124 09:19:26.044001 1701291 command_runner.go:130] > Device: 259,1	Inode: 1320367     Links: 1
	I1124 09:19:26.044008 1701291 command_runner.go:130] > Access: (0644/-rw-r--r--)  Uid: (    0/    root)   Gid: (    0/    root)
	I1124 09:19:26.044023 1701291 command_runner.go:130] > Access: 2025-11-24 09:15:17.409446871 +0000
	I1124 09:19:26.044028 1701291 command_runner.go:130] > Modify: 2025-11-24 09:11:12.722825550 +0000
	I1124 09:19:26.044034 1701291 command_runner.go:130] > Change: 2025-11-24 09:11:12.722825550 +0000
	I1124 09:19:26.044039 1701291 command_runner.go:130] >  Birth: 2025-11-24 09:11:12.722825550 +0000
	I1124 09:19:26.044132 1701291 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I1124 09:19:26.086676 1701291 command_runner.go:130] > Certificate will not expire
	I1124 09:19:26.086876 1701291 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I1124 09:19:26.129915 1701291 command_runner.go:130] > Certificate will not expire
	I1124 09:19:26.130020 1701291 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I1124 09:19:26.173544 1701291 command_runner.go:130] > Certificate will not expire
	I1124 09:19:26.174084 1701291 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I1124 09:19:26.214370 1701291 command_runner.go:130] > Certificate will not expire
	I1124 09:19:26.214874 1701291 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I1124 09:19:26.257535 1701291 command_runner.go:130] > Certificate will not expire
	I1124 09:19:26.257999 1701291 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I1124 09:19:26.298467 1701291 command_runner.go:130] > Certificate will not expire
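Each `-checkend 86400` probe above asks whether a control-plane cert will still be valid 24 hours from now; exit status 0 together with the "Certificate will not expire" message means no renewal is needed before cluster restart. A minimal reproduction with a throwaway cert rather than minikube's (`openssl` assumed on PATH; names are illustrative):

```shell
tmp=$(mktemp -d)

# Fresh 30-day self-signed cert, so the 24h check trivially passes.
openssl req -x509 -newkey rsa:2048 -nodes -subj "/CN=demo" \
  -keyout "$tmp/demo.key" -out "$tmp/demo.crt" -days 30 2>/dev/null

# -checkend N: exit 0 and print "Certificate will not expire" if the cert
# is still valid N seconds from now (86400 = 24h, as in the log).
openssl x509 -noout -in "$tmp/demo.crt" -checkend 86400
```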
	I1124 09:19:26.298937 1701291 kubeadm.go:401] StartCluster: {Name:functional-291288 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-291288 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1124 09:19:26.299045 1701291 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I1124 09:19:26.299146 1701291 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1124 09:19:26.324900 1701291 cri.go:89] found id: ""
	I1124 09:19:26.325047 1701291 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1124 09:19:26.331898 1701291 command_runner.go:130] > /var/lib/kubelet/config.yaml
	I1124 09:19:26.331976 1701291 command_runner.go:130] > /var/lib/kubelet/kubeadm-flags.env
	I1124 09:19:26.331999 1701291 command_runner.go:130] > /var/lib/minikube/etcd:
	I1124 09:19:26.332730 1701291 kubeadm.go:417] found existing configuration files, will attempt cluster restart
	I1124 09:19:26.332771 1701291 kubeadm.go:598] restartPrimaryControlPlane start ...
	I1124 09:19:26.332851 1701291 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I1124 09:19:26.340023 1701291 kubeadm.go:131] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I1124 09:19:26.340455 1701291 kubeconfig.go:47] verify endpoint returned: get endpoint: "functional-291288" does not appear in /home/jenkins/minikube-integration/21978-1652607/kubeconfig
	I1124 09:19:26.340556 1701291 kubeconfig.go:62] /home/jenkins/minikube-integration/21978-1652607/kubeconfig needs updating (will repair): [kubeconfig missing "functional-291288" cluster setting kubeconfig missing "functional-291288" context setting]
	I1124 09:19:26.340827 1701291 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21978-1652607/kubeconfig: {Name:mk02121ae6148bede61eabf0ed4e1826024715f4 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1124 09:19:26.341245 1701291 loader.go:402] Config loaded from file:  /home/jenkins/minikube-integration/21978-1652607/kubeconfig
	I1124 09:19:26.341410 1701291 kapi.go:59] client config for functional-291288: &rest.Config{Host:"https://192.168.49.2:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/client.crt", KeyFile:"/home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/client.key", CAFile:"/home/jenkins/minikube-integration/21978-1652607/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1fb2df0), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1124 09:19:26.341966 1701291 envvar.go:172] "Feature gate default state" feature="ClientsAllowCBOR" enabled=false
	I1124 09:19:26.341987 1701291 envvar.go:172] "Feature gate default state" feature="ClientsPreferCBOR" enabled=false
	I1124 09:19:26.341993 1701291 envvar.go:172] "Feature gate default state" feature="InformerResourceVersion" enabled=false
	I1124 09:19:26.341999 1701291 envvar.go:172] "Feature gate default state" feature="InOrderInformers" enabled=true
	I1124 09:19:26.342005 1701291 envvar.go:172] "Feature gate default state" feature="WatchListClient" enabled=false
	I1124 09:19:26.342302 1701291 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I1124 09:19:26.342404 1701291 cert_rotation.go:141] "Starting client certificate rotation controller" logger="tls-transport-cache"
	I1124 09:19:26.349720 1701291 kubeadm.go:635] The running cluster does not require reconfiguration: 192.168.49.2
	I1124 09:19:26.349757 1701291 kubeadm.go:602] duration metric: took 16.96677ms to restartPrimaryControlPlane
	I1124 09:19:26.349768 1701291 kubeadm.go:403] duration metric: took 50.840633ms to StartCluster
	I1124 09:19:26.349802 1701291 settings.go:142] acquiring lock: {Name:mk6c04793f5fd4f38f92abf4357247f2ccd7fc4e Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1124 09:19:26.349888 1701291 settings.go:150] Updating kubeconfig:  /home/jenkins/minikube-integration/21978-1652607/kubeconfig
	I1124 09:19:26.350548 1701291 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21978-1652607/kubeconfig: {Name:mk02121ae6148bede61eabf0ed4e1826024715f4 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1124 09:19:26.350757 1701291 start.go:236] Will wait 6m0s for node &{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I1124 09:19:26.351051 1701291 config.go:182] Loaded profile config "functional-291288": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1124 09:19:26.351103 1701291 addons.go:527] enable addons start: toEnable=map[ambassador:false amd-gpu-device-plugin:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:true efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubetail:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:true storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I1124 09:19:26.351171 1701291 addons.go:70] Setting storage-provisioner=true in profile "functional-291288"
	I1124 09:19:26.351184 1701291 addons.go:239] Setting addon storage-provisioner=true in "functional-291288"
	I1124 09:19:26.351210 1701291 host.go:66] Checking if "functional-291288" exists ...
	I1124 09:19:26.351260 1701291 addons.go:70] Setting default-storageclass=true in profile "functional-291288"
	I1124 09:19:26.351281 1701291 addons_storage_classes.go:34] enableOrDisableStorageClasses default-storageclass=true on "functional-291288"
	I1124 09:19:26.351591 1701291 cli_runner.go:164] Run: docker container inspect functional-291288 --format={{.State.Status}}
	I1124 09:19:26.351665 1701291 cli_runner.go:164] Run: docker container inspect functional-291288 --format={{.State.Status}}
	I1124 09:19:26.356026 1701291 out.go:179] * Verifying Kubernetes components...
	I1124 09:19:26.358753 1701291 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1124 09:19:26.386934 1701291 loader.go:402] Config loaded from file:  /home/jenkins/minikube-integration/21978-1652607/kubeconfig
	I1124 09:19:26.387124 1701291 kapi.go:59] client config for functional-291288: &rest.Config{Host:"https://192.168.49.2:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/client.crt", KeyFile:"/home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/client.key", CAFile:"/home/jenkins/minikube-integration/21978-1652607/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1fb2df0), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1124 09:19:26.387397 1701291 addons.go:239] Setting addon default-storageclass=true in "functional-291288"
	I1124 09:19:26.387423 1701291 host.go:66] Checking if "functional-291288" exists ...
	I1124 09:19:26.387832 1701291 cli_runner.go:164] Run: docker container inspect functional-291288 --format={{.State.Status}}
	I1124 09:19:26.389901 1701291 out.go:179]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	I1124 09:19:26.395008 1701291 addons.go:436] installing /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 09:19:26.395037 1701291 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
	I1124 09:19:26.395101 1701291 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:19:26.420232 1701291 addons.go:436] installing /etc/kubernetes/addons/storageclass.yaml
	I1124 09:19:26.420253 1701291 ssh_runner.go:362] scp storageclass/storageclass.yaml --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
	I1124 09:19:26.420313 1701291 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:19:26.425570 1701291 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34684 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-291288/id_rsa Username:docker}
	I1124 09:19:26.456516 1701291 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34684 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-291288/id_rsa Username:docker}
	I1124 09:19:26.560922 1701291 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1124 09:19:26.576856 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 09:19:26.613035 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
	I1124 09:19:27.382844 1701291 node_ready.go:35] waiting up to 6m0s for node "functional-291288" to be "Ready" ...
	I1124 09:19:27.383045 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:27.383222 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:27.383136 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:27.383333 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:27.383470 1701291 retry.go:31] will retry after 330.402351ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:27.383574 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:27.383622 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:27.383641 1701291 retry.go:31] will retry after 362.15201ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:27.383749 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:27.714181 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 09:19:27.746972 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1124 09:19:27.795758 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:27.795808 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:27.795853 1701291 retry.go:31] will retry after 486.739155ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:27.825835 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:27.825930 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:27.825968 1701291 retry.go:31] will retry after 300.110995ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:27.884058 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:27.884183 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:27.884499 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:28.126983 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1124 09:19:28.217006 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:28.217052 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:28.217072 1701291 retry.go:31] will retry after 300.765079ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:28.283248 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 09:19:28.347318 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:28.347417 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:28.347441 1701291 retry.go:31] will retry after 303.335388ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:28.383528 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:28.383642 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:28.383982 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:28.518292 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1124 09:19:28.580592 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:28.580640 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:28.580660 1701291 retry.go:31] will retry after 1.066338993s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:28.651903 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 09:19:28.713844 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:28.713897 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:28.713918 1701291 retry.go:31] will retry after 1.056665241s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:28.884118 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:28.884220 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:28.884569 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:29.383298 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:29.383424 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:29.383770 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:19:29.383848 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:19:29.647985 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1124 09:19:29.716805 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:29.720169 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:29.720200 1701291 retry.go:31] will retry after 944.131514ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:29.771443 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 09:19:29.838798 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:29.842880 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:29.842911 1701291 retry.go:31] will retry after 1.275018698s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:29.883132 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:29.883209 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:29.883509 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:30.383649 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:30.383776 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:30.384127 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:30.664505 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1124 09:19:30.720036 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:30.723467 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:30.723535 1701291 retry.go:31] will retry after 2.138623105s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:30.883817 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:30.883887 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:30.884224 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:31.118957 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 09:19:31.199799 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:31.199840 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:31.199882 1701291 retry.go:31] will retry after 2.182241097s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:31.383252 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:31.383376 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:31.383741 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:31.883141 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:31.883218 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:31.883484 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:19:31.883535 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:19:32.383203 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:32.383282 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:32.383615 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:32.863283 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1124 09:19:32.883678 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:32.883784 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:32.884128 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:32.923038 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:32.923079 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:32.923098 1701291 retry.go:31] will retry after 3.572603171s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:33.382308 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 09:19:33.383761 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:33.383826 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:33.384119 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:33.453074 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:33.453119 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:33.453141 1701291 retry.go:31] will retry after 3.109489242s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:33.883699 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:33.883773 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:33.884102 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:19:33.884157 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:19:34.383924 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:34.383999 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:34.384345 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:34.883591 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:34.883679 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:34.883980 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:35.383814 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:35.383894 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:35.384241 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:35.884036 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:35.884171 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:35.884537 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:19:35.884594 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:19:36.383696 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:36.383766 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:36.384025 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:36.496437 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1124 09:19:36.551663 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:36.555562 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:36.555638 1701291 retry.go:31] will retry after 5.073494199s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:36.562783 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 09:19:36.628271 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:36.628317 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:36.628342 1701291 retry.go:31] will retry after 5.770336946s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:36.883845 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:36.883918 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:36.884243 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:37.384077 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:37.384153 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:37.384472 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:37.883154 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:37.883226 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:37.883536 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:38.383157 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:38.383232 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:38.383563 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:19:38.383620 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:19:38.883187 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:38.883316 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:38.883616 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:39.383889 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:39.383969 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:39.384246 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:39.884093 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:39.884182 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:39.884521 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:40.383195 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:40.383272 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:40.383608 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:19:40.383663 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:19:40.883068 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:40.883144 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:40.883421 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:41.383174 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:41.383248 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:41.383670 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:41.630088 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1124 09:19:41.704671 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:41.704728 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:41.704747 1701291 retry.go:31] will retry after 8.448093656s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:41.884076 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:41.884161 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:41.884479 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:42.383803 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:42.383879 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:42.384141 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:19:42.384184 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:19:42.399541 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 09:19:42.476011 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:42.476071 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:42.476093 1701291 retry.go:31] will retry after 9.502945959s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:42.883588 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:42.883671 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:42.884026 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:43.383828 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:43.383907 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:43.384181 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:43.883670 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:43.883743 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:43.884060 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:44.383696 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:44.383771 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:44.384127 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:19:44.384222 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:19:44.883976 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:44.884053 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:44.884413 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:45.383648 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:45.383811 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:45.384197 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:45.883998 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:45.884089 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:45.884467 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:46.383603 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:46.383678 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:46.384022 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:46.883619 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:46.883699 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:46.883981 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:19:46.884038 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:19:47.383777 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:47.383855 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:47.384200 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:47.883911 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:47.884016 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:47.884384 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:48.383668 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:48.383739 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:48.384087 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:48.883874 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:48.883952 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:48.884283 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:19:48.884343 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:19:49.383082 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:49.383173 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:49.383540 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:49.883082 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:49.883151 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:49.883411 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:50.153986 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1124 09:19:50.216789 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:50.216837 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:50.216857 1701291 retry.go:31] will retry after 12.027560843s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:50.383117 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:50.383226 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:50.383583 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:50.883287 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:50.883368 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:50.883726 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:51.383622 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:51.383710 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:51.384038 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:19:51.384100 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:19:51.883690 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:51.883770 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:51.884105 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:51.979351 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 09:19:52.048232 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:52.048287 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:52.048307 1701291 retry.go:31] will retry after 5.922680138s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:52.383846 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:52.383926 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:52.384262 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:52.883642 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:52.883714 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:52.884029 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:53.383844 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:53.383917 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:53.384249 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:19:53.384309 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:19:53.884029 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:53.884108 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:53.884493 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:54.383680 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:54.383755 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:54.384008 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:54.883852 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:54.883926 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:54.884262 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:55.384060 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:55.384132 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:55.384467 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:19:55.384528 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:19:55.883800 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:55.883874 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:55.884153 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:56.383266 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:56.383344 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:56.383682 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:56.883176 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:56.883258 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:56.883607 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:57.383853 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:57.383936 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:57.384284 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:57.884078 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:57.884157 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:57.884542 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:19:57.884608 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:19:57.972042 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 09:19:58.032393 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:58.036131 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:58.036169 1701291 retry.go:31] will retry after 15.323516146s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:58.383700 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:58.383776 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:58.384074 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:58.883637 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:58.883711 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:58.883992 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:59.383767 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:59.383847 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:59.384170 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:59.883954 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:59.884029 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:59.884364 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:00.386702 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:00.386929 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:00.387350 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:00.387652 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:00.883998 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:00.884089 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:00.884461 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:01.383250 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:01.383328 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:01.383704 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:01.883996 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:01.884068 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:01.884357 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:02.244687 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1124 09:20:02.303604 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:20:02.306952 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:20:02.306992 1701291 retry.go:31] will retry after 20.630907774s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:20:02.383202 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:02.383281 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:02.383599 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:02.883330 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:02.883410 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:02.883745 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:02.883800 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:03.383311 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:03.383386 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:03.383651 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:03.883196 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:03.883275 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:03.883594 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:04.383175 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:04.383259 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:04.383568 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:04.883120 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:04.883192 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:04.883478 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:05.383217 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:05.383295 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:05.383624 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:05.383680 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:05.883368 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:05.883446 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:05.883773 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:06.383723 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:06.383806 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:06.384068 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:06.883869 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:06.883945 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:06.884264 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:07.384063 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:07.384138 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:07.384462 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:07.384526 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:07.883109 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:07.883188 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:07.883446 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:08.383152 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:08.383224 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:08.383566 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:08.883199 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:08.883279 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:08.883603 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:09.383126 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:09.383212 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:09.383470 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:09.883162 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:09.883254 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:09.883549 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:09.883599 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:10.383190 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:10.383264 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:10.383586 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:10.883817 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:10.883892 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:10.884145 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:11.383273 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:11.383344 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:11.383622 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:11.883313 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:11.883389 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:11.883749 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:11.883806 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:12.383448 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:12.383520 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:12.383791 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:12.883177 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:12.883257 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:12.883572 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:13.360275 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 09:20:13.383805 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:13.383886 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:13.384154 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:13.423794 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:20:13.423847 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:20:13.423866 1701291 retry.go:31] will retry after 19.725114159s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:20:13.884034 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:13.884124 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:13.884430 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:13.884481 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:14.383158 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:14.383258 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:14.383624 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:14.883202 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:14.883284 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:14.883644 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:15.383356 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:15.383435 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:15.383734 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:15.883472 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:15.883549 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:15.883909 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:16.384044 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:16.384118 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:16.384464 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:16.384554 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:16.883205 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:16.883292 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:16.883609 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:17.383212 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:17.383289 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:17.383587 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:17.883207 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:17.883289 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:17.883594 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:18.383758 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:18.383840 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:18.384110 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:18.883984 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:18.884085 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:18.884539 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:18.884620 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:19.383195 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:19.383308 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:19.383679 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:19.883124 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:19.883204 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:19.883474 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:20.383183 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:20.383264 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:20.383612 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:20.883327 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:20.883410 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:20.883750 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:21.383113 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:21.383189 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:21.383447 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:21.383491 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:21.883186 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:21.883258 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:21.883619 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:22.383192 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:22.383277 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:22.383650 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:22.883350 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:22.883422 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:22.883692 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:22.939045 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1124 09:20:23.002892 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:20:23.002941 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:20:23.002963 1701291 retry.go:31] will retry after 24.365576381s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:20:23.384046 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:23.384125 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:23.384460 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:23.384522 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:23.883216 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:23.883293 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:23.883634 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:24.383833 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:24.383929 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:24.384212 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:24.884088 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:24.884168 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:24.884519 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:25.383227 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:25.383307 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:25.383654 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:25.883912 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:25.883982 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:25.884337 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:25.884396 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:26.383528 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:26.383619 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:26.383952 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:26.883735 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:26.883810 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:26.884149 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:27.383645 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:27.383725 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:27.384079 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:27.883693 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:27.883792 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:27.884080 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:28.383869 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:28.383941 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:28.384276 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:28.384333 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:28.883621 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:28.883696 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:28.884021 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:29.383693 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:29.383768 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:29.384125 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:29.883838 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:29.883920 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:29.884279 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:30.383628 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:30.383705 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:30.383961 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:30.883414 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:30.883492 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:30.883837 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:30.883893 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:31.383689 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:31.383767 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:31.384087 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:31.883629 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:31.883699 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:31.883964 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:32.383819 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:32.383897 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:32.384254 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:32.884067 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:32.884145 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:32.884453 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:32.884504 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:33.149949 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 09:20:33.204697 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:20:33.208037 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:20:33.208070 1701291 retry.go:31] will retry after 22.392696015s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:20:33.383469 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:33.383538 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:33.383796 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:33.883550 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:33.883634 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:33.883947 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:34.383737 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:34.383811 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:34.384171 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:34.883654 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:34.883734 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:34.884066 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:35.383856 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:35.383928 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:35.384271 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:35.384326 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:35.883926 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:35.884005 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:35.884370 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:36.383314 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:36.383384 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:36.383644 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:36.883149 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:36.883225 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:36.883565 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:37.383275 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:37.383359 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:37.383702 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:37.883387 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:37.883466 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:37.883722 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:37.883762 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:38.383178 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:38.383252 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:38.383603 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:38.883170 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:38.883244 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:38.883650 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:39.383205 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:39.383274 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:39.383534 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:39.883192 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:39.883267 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:39.883632 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:40.383376 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:40.383463 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:40.383839 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:40.383896 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:40.883093 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:40.883170 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:40.883479 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:41.383194 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:41.383276 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:41.383635 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:41.883337 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:41.883422 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:41.883716 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:42.383386 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:42.383461 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:42.383814 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:42.883190 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:42.883266 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:42.883601 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:42.883670 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:43.383217 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:43.383293 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:43.383671 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:43.883117 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:43.883198 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:43.883473 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:44.383174 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:44.383255 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:44.383558 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:44.883289 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:44.883363 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:44.883641 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:45.383065 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:45.383134 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:45.383415 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:45.383456 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:45.883191 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:45.883274 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:45.883563 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:46.383411 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:46.383487 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:46.383849 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:46.883402 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:46.883490 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:46.883752 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:47.369539 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1124 09:20:47.383080 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:47.383149 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:47.383440 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:47.383498 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:47.426348 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:20:47.429646 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:20:47.429686 1701291 retry.go:31] will retry after 22.399494886s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:20:47.883262 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:47.883365 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:47.883699 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:48.383121 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:48.383192 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:48.383450 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:48.883172 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:48.883246 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:48.883565 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:49.383175 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:49.383254 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:49.383619 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:49.383673 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:49.883307 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:49.883381 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:49.883648 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:50.383167 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:50.383242 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:50.383602 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:50.883305 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:50.883403 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:50.883701 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:51.383597 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:51.383671 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:51.383949 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:51.383999 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:51.883799 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:51.883891 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:51.884215 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:52.383953 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:52.384046 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:52.384337 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:52.883622 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:52.883695 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:52.883974 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:53.383750 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:53.383840 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:53.384189 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:53.384246 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:53.883872 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:53.883946 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:53.884278 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:54.383691 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:54.383768 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:54.384062 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:54.883870 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:54.883951 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:54.884279 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:55.384078 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:55.384159 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:55.384531 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:55.384594 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:55.601942 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 09:20:55.661064 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:20:55.665031 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:20:55.665156 1701291 out.go:285] ! Enabling 'storage-provisioner' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	! Enabling 'storage-provisioner' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	I1124 09:20:55.883471 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:55.883549 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:55.883839 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:56.384006 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:56.384085 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:56.384438 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:56.883159 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:56.883256 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:56.883546 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:57.383058 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:57.383127 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:57.383401 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:57.883132 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:57.883210 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:57.883522 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:57.883572 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:58.383147 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:58.383243 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:58.383538 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:58.883654 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:58.883729 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:58.883987 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:59.383805 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:59.383888 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:59.384179 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:59.883985 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:59.884058 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:59.884355 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:59.884403 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:00.383754 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:00.383833 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:00.384151 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:00.883935 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:00.884016 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:00.884352 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:01.383267 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:01.383344 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:01.383652 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:01.883147 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:01.883222 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:01.883556 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:02.383255 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:02.383332 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:02.383663 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:02.383721 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:02.883448 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:02.883530 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:02.883895 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:03.383623 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:03.383692 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:03.383959 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:03.883727 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:03.883833 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:03.884183 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:04.383989 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:04.384068 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:04.384431 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:04.384491 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:04.883658 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:04.883737 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:04.884051 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:05.383792 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:05.383863 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:05.384221 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:05.883873 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:05.883951 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:05.884288 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:06.383270 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:06.383343 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:06.383618 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:06.883169 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:06.883268 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:06.883573 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:06.883620 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:07.383342 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:07.383427 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:07.383765 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:07.884027 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:07.884094 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:07.884425 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:08.383164 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:08.383239 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:08.383598 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:08.883308 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:08.883398 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:08.883741 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:08.883802 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:09.383098 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:09.383166 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:09.383423 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:09.830147 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1124 09:21:09.883707 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:09.883815 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:09.884234 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:09.887265 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:21:09.890761 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:21:09.890861 1701291 out.go:285] ! Enabling 'default-storageclass' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	! Enabling 'default-storageclass' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	I1124 09:21:09.895836 1701291 out.go:179] * Enabled addons: 
	I1124 09:21:09.899594 1701291 addons.go:530] duration metric: took 1m43.548488453s for enable addons: enabled=[]
	I1124 09:21:10.383381 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:10.383468 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:10.383851 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:10.883541 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:10.883612 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:10.883871 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:10.883921 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:11.383721 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:11.383804 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:11.384146 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:11.883758 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:11.883832 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:11.884153 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:12.383650 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:12.383725 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:12.383994 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:12.883791 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:12.883869 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:12.884200 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:12.884259 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:13.384051 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:13.384130 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:13.384481 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:13.883069 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:13.883147 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:13.883443 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:14.383158 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:14.383256 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:14.383600 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:14.883308 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:14.883386 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:14.883743 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:15.383457 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:15.383524 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:15.383790 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:15.383833 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:15.883160 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:15.883235 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:15.883570 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:16.383347 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:16.383428 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:16.383759 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:16.883325 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:16.883399 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:16.883664 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:17.383191 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:17.383290 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:17.383661 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:17.883228 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:17.883306 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:17.883672 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:17.883730 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:18.383978 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:18.384061 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:18.384373 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:18.883112 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:18.883209 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:18.883544 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:19.383112 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:19.383198 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:19.383544 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:19.883660 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:19.883735 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:19.883994 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:19.884034 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:20.383840 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:20.383939 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:20.384276 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:20.884063 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:20.884139 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:20.884609 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:21.383122 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:21.383191 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:21.383474 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:21.883229 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:21.883311 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:21.883643 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:22.383182 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:22.383259 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:22.383610 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:22.383663 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:22.884007 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:22.884077 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:22.884343 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:23.384148 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:23.384238 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:23.384581 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:23.883132 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:23.883207 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:23.883554 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:24.383088 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:24.383159 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:24.383481 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:24.883179 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:24.883275 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:24.883610 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:24.883675 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:25.383186 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:25.383268 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:25.383608 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:25.883472 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:25.883586 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:25.884129 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:26.383146 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:26.383230 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:26.383577 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:26.883188 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:26.883299 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:26.883678 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:26.883758 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:27.384111 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:27.384181 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:27.384491 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:27.883092 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:27.883171 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:27.883515 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:28.383158 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:28.383240 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:28.383623 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:28.883309 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:28.883385 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:28.883717 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:29.383191 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:29.383266 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:29.383650 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:29.383702 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:29.883188 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:29.883275 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:29.883613 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:30.383126 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:30.383193 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:30.383490 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:30.883171 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:30.883254 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:30.883605 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:31.383171 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:31.383250 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:31.383601 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:31.883122 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:31.883191 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:31.883449 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:31.883489 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:32.383217 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:32.383291 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:32.383620 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:32.883183 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:32.883260 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:32.883629 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:33.383180 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:33.383262 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:33.383549 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:33.883190 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:33.883271 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:33.883638 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:33.883694 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:34.383256 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:34.383337 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:34.383680 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:34.883132 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:34.883214 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:34.883526 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:35.383172 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:35.383252 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:35.383615 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:35.883199 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:35.883282 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:35.883582 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:36.383281 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:36.383351 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:36.383609 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:36.383650 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:36.883304 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:36.883386 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:36.883706 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:37.383434 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:37.383512 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:37.383858 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:37.883555 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:37.883635 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:37.883920 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:38.383713 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:38.383800 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:38.384150 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:38.384211 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:38.884005 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:38.884085 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:38.884432 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:39.383117 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:39.383189 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:39.383470 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:39.883210 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:39.883320 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:39.883681 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:40.383192 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:40.383273 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:40.383648 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:40.883329 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:40.883413 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:40.883677 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:40.883719 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:41.383810 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:41.383891 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:41.384260 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:41.884110 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:41.884211 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:41.884610 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:42.383111 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:42.383184 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:42.383469 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:42.883146 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:42.883219 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:42.883556 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:43.383303 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:43.383390 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:43.383815 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:43.383880 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:43.884150 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:43.884225 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:43.884489 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:44.383187 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:44.383285 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:44.383631 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:44.883347 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:44.883424 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:44.883787 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:45.383143 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:45.383221 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:45.383485 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:45.883220 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:45.883291 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:45.883631 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:45.883683 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:46.383565 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:46.383643 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:46.384005 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:46.883681 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:46.883753 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:46.884095 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:47.383931 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:47.384032 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:47.384438 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:47.884098 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:47.884173 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:47.884475 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:47.884521 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:48.383141 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:48.383214 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:48.383504 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:48.883189 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:48.883295 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:48.883641 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:49.383237 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:49.383316 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:49.383651 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:49.883138 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:49.883214 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:49.883514 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:50.383163 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:50.383242 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:50.383592 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:50.383651 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:50.883194 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:50.883284 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:50.883599 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:51.383074 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:51.383155 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:51.383436 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:51.883141 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:51.883231 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:51.883582 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:52.383155 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:52.383242 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:52.383575 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:52.883252 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:52.883327 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:52.883595 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:52.883642 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:53.383321 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:53.383392 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:53.383737 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:53.883216 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:53.883293 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:53.883646 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:54.383339 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:54.383413 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:54.383688 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:54.883201 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:54.883274 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:54.883590 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:55.383288 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:55.383366 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:55.383704 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:55.383769 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:55.883435 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:55.883505 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:55.883816 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:56.384009 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:56.384088 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:56.384422 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:56.883137 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:56.883222 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:56.883558 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:57.383818 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:57.383897 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:57.384172 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:57.384212 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:57.883977 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:57.884053 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:57.884399 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:58.383153 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:58.383233 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:58.383556 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:58.883101 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:58.883177 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:58.883433 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:59.383157 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:59.383236 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:59.383565 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:59.883168 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:59.883258 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:59.883650 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:59.883705 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:00.383305 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:00.383386 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:00.383837 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:00.883166 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:00.883245 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:00.883577 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:01.383186 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:01.383267 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:01.383606 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:01.883853 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:01.883923 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:01.884206 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:01.884257 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:02.384016 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:02.384095 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:02.384455 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:02.884100 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:02.884181 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:02.884522 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:03.383131 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:03.383207 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:03.383521 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:03.883200 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:03.883287 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:03.883604 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:04.383190 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:04.383274 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:04.383643 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:04.383702 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:04.883207 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:04.883279 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:04.883551 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:05.383188 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:05.383272 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:05.383607 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:05.883177 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:05.883257 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:05.883627 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:06.383792 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:06.383879 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:06.384240 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:06.384291 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:06.884040 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:06.884120 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:06.884445 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:07.383154 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:07.383230 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:07.383566 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:07.883872 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:07.883944 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:07.884212 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:08.383967 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:08.384042 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:08.384363 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:08.384428 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:08.883105 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:08.883184 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:08.883520 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:09.383654 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:09.383727 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:09.384039 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:09.883710 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:09.883788 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:09.884141 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:10.383940 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:10.384022 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:10.384358 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:10.883643 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:10.883717 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:10.883979 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:10.884026 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:11.384043 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:11.384119 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:11.384476 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:11.884107 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:11.884182 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:11.884497 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:12.383080 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:12.383156 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:12.383420 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:12.883114 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:12.883197 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:12.883546 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:13.383148 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:13.383235 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:13.383567 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:13.383626 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:13.883128 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:13.883206 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:13.883519 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:14.383161 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:14.383237 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:14.383589 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:14.883306 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:14.883385 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:14.883735 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:15.384005 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:15.384082 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:15.384357 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:15.384407 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:15.883074 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:15.883147 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:15.883531 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:16.383351 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:16.383433 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:16.383810 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:16.883361 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:16.883437 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:16.883741 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:17.383154 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:17.383240 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:17.383580 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:17.883164 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:17.883241 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:17.883543 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:17.883590 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:18.383110 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:18.383195 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:18.383512 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:18.883210 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:18.883284 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:18.883632 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:19.383181 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:19.383265 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:19.383589 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:19.883083 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:19.883153 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:19.883418 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:20.383720 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:20.383806 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:20.384138 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:20.384189 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:20.883895 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:20.883977 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:20.884383 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:21.383110 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:21.383179 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:21.383449 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:21.883148 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:21.883222 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:21.883554 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:22.383181 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:22.383267 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:22.383745 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:22.883438 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:22.883512 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:22.883827 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:22.883878 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:23.383219 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:23.383300 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:23.383650 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:23.883352 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:23.883439 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:23.883739 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:24.383094 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:24.383172 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:24.383441 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:24.883170 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:24.883246 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:24.883573 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:25.383180 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:25.383258 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:25.383557 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:25.383602 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:25.883124 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:25.883200 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:25.883530 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:26.383418 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:26.383502 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:26.383820 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:26.883156 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:26.883232 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:26.883574 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:27.383253 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:27.383325 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:27.383640 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:27.383695 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:27.883229 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:27.883308 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:27.883663 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:28.383352 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:28.383428 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:28.383771 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:28.883152 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:28.883224 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:28.883533 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:29.383259 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:29.383346 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:29.383718 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:29.383781 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:29.883468 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:29.883551 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:29.883860 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:30.383098 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:30.383174 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:30.383431 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:30.883167 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:30.883258 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:30.883626 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:31.383185 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:31.383269 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:31.383600 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:31.883135 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:31.883200 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:31.883477 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:31.883524 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:32.383251 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:32.383334 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:32.383667 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:32.883186 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:32.883294 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:32.883590 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:33.383124 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:33.383196 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:33.383537 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:33.883238 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:33.883319 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:33.883668 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:33.883725 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:34.383411 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:34.383500 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:34.383842 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:34.883113 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:34.883201 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:34.883459 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:35.383178 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:35.383251 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:35.383573 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:35.883163 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:35.883245 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:35.883570 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:36.383743 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:36.383821 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:36.384077 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:36.384116 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:36.883871 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:36.883954 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:36.884285 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:37.384043 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:37.384116 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:37.384446 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:37.883126 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:37.883195 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:37.883464 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:38.383151 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:38.383233 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:38.383571 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:38.883272 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:38.883352 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:38.883655 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:38.883702 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:39.383349 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:39.383416 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:39.383686 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:39.883194 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:39.883268 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:39.883616 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:40.383355 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:40.383439 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:40.383825 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:40.883050 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:40.883119 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:40.883381 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:41.383178 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:41.383263 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:41.383602 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:41.383659 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:41.883334 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:41.883418 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:41.883737 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:42.383098 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:42.383164 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:42.383505 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:42.883181 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:42.883256 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:42.883607 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:43.383328 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:43.383407 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:43.383779 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:43.383848 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:43.883040 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:43.883108 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:43.883373 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:44.383062 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:44.383137 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:44.383488 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:44.883210 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:44.883294 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:44.883624 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:45.383177 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:45.383283 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:45.383549 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:45.883285 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:45.883371 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:45.883679 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:45.883730 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:46.383910 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:46.383988 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:46.384338 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:46.883634 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:46.883708 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:46.883972 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:47.383794 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:47.383890 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:47.384333 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:47.883084 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:47.883172 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:47.883512 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:48.383207 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:48.383278 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:48.383553 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:48.383599 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:48.883146 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:48.883219 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:48.883545 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:49.383190 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:49.383267 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:49.383618 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:49.883863 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:49.883935 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:49.884201 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:50.384017 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:50.384095 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:50.384461 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:50.384517 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:50.883189 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:50.883269 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:50.883636 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:51.383067 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:51.383134 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:51.383393 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:51.883095 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:51.883170 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:51.883486 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:52.383088 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:52.383168 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:52.383503 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:52.883649 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:52.883715 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:52.883972 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:52.884013 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:53.383510 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:53.383586 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:53.383942 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:53.883728 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:53.883810 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:53.884186 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:54.383720 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:54.383800 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:54.384075 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:54.883881 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:54.883959 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:54.884315 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:54.884374 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:55.383090 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:55.383174 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:55.383511 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:55.883189 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:55.883267 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:55.883536 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:56.383980 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:56.384072 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:56.384430 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:56.883181 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:56.883264 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:56.883592 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:57.383208 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:57.383283 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:57.383562 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:57.383632 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:57.883178 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:57.883262 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:57.883557 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:58.383270 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:58.383366 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:58.383681 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:58.883065 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:58.883142 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:58.883409 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:59.383139 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:59.383281 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:59.383599 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:59.883295 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:59.883368 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:59.883709 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:59.883767 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:00.383455 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:00.383535 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:00.383834 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:00.883728 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:00.883804 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:00.884143 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:01.383926 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:01.384011 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:01.384371 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:01.883658 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:01.883732 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:01.884049 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:01.884099 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:02.383854 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:02.383936 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:02.384276 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:02.883965 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:02.884044 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:02.884421 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:03.383112 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:03.383191 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:03.383460 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:03.883214 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:03.883310 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:03.883701 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:04.383439 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:04.383520 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:04.383845 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:04.383903 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:04.883132 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:04.883203 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:04.883548 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:05.383170 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:05.383248 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:05.383591 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:05.883308 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:05.883386 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:05.883685 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:06.383882 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:06.383959 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:06.384277 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:06.384350 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:06.884102 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:06.884178 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:06.884513 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:07.383085 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:07.383184 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:07.383543 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:07.883883 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:07.883956 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:07.884221 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:08.384050 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:08.384123 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:08.384452 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:08.384509 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:08.883183 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:08.883259 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:08.883584 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:09.383246 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:09.383318 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:09.383640 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:09.883219 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:09.883299 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:09.883648 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:10.383378 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:10.383458 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:10.383753 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:10.883317 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:10.883388 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:10.883680 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:10.883723 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:11.383702 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:11.383803 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:11.384131 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:11.883721 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:11.883799 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:11.884129 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:12.383663 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:12.383738 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:12.384067 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:12.883854 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:12.883940 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:12.884274 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:12.884334 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:13.384116 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:13.384195 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:13.384538 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:13.883793 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:13.883869 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:13.884135 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:14.383911 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:14.383994 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:14.384297 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:14.883958 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:14.884048 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:14.884401 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:14.884456 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:15.383622 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:15.383700 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:15.383974 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:15.883699 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:15.883778 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:15.884117 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:16.384146 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:16.384226 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:16.384578 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:16.883198 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:16.883271 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:16.883565 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:17.383190 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:17.383273 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:17.383627 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:17.383682 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:17.883355 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:17.883436 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:17.883756 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:18.383116 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:18.383185 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:18.383441 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:18.883189 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:18.883268 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:18.883596 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:19.383187 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:19.383269 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:19.383630 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:19.883313 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:19.883385 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:19.883674 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:19.883725 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:20.383174 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:20.383257 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:20.383614 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:20.883340 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:20.883415 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:20.883771 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:21.383646 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:21.383720 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:21.383985 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:21.883845 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:21.883928 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:21.884313 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:21.884368 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:22.383054 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:22.383138 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:22.383471 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:22.883086 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:22.883170 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:22.883470 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:23.383158 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:23.383233 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:23.383562 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:23.883247 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:23.883321 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:23.883637 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:24.383095 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:24.383165 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:24.383431 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:24.383471 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:24.883124 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:24.883205 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:24.883534 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:25.383173 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:25.383251 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:25.383575 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:25.883126 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:25.883196 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:25.883470 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:26.383072 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:26.383157 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:26.383507 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:26.383568 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:26.883510 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:26.883587 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:26.883957 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:27.383649 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:27.383724 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:27.384102 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:27.883942 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:27.884029 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:27.884418 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:28.383174 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:28.383254 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:28.383611 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:28.383665 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:28.883114 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:28.883191 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:28.883456 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:29.383167 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:29.383248 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:29.383597 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:29.883182 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:29.883258 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:29.883579 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:30.383122 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:30.383199 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:30.383527 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:30.883137 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:30.883215 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:30.883514 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:30.883561 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:31.383185 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:31.383260 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:31.383609 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:31.883900 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:31.883970 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:31.884278 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:32.384086 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:32.384160 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:32.384455 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:32.883182 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:32.883259 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:32.883600 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:32.883657 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:33.383111 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:33.383190 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:33.383455 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:33.883174 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:33.883260 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:33.883641 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:34.383362 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:34.383442 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:34.383802 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:34.883106 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:34.883183 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:34.883439 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:35.383143 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:35.383220 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:35.383551 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:35.383604 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:35.883151 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:35.883229 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:35.883562 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:36.383293 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:36.383366 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:36.383619 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:36.883173 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:36.883255 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:36.883580 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:37.383161 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:37.383237 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:37.383584 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:37.383635 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:37.883107 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:37.883182 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:37.883504 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:38.383163 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:38.383248 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:38.383593 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:38.883178 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:38.883257 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:38.883615 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:39.383868 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:39.383940 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:39.384210 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:39.384251 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:39.883999 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:39.884075 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:39.884422 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:40.383152 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:40.383231 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:40.383560 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:40.883278 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:40.883355 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:40.883619 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:41.383462 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:41.383549 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:41.383883 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:41.883473 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:41.883550 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:41.883893 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:41.883952 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:42.383654 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:42.383728 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:42.384013 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:42.883799 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:42.883875 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:42.884236 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:43.384072 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:43.384157 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:43.384486 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:43.883149 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:43.883222 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:43.883524 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:44.383233 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:44.383315 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:44.383652 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:44.383713 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:44.883155 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:44.883243 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:44.883579 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:45.383126 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:45.383203 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:45.383524 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:45.883188 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:45.883264 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:45.883628 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:46.383346 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:46.383429 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:46.383765 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:46.383819 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:46.883878 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:46.883951 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:46.884224 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:47.384060 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:47.384136 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:47.384469 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:47.883170 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:47.883249 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:47.883589 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:48.383128 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:48.383211 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:48.383474 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:48.883184 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:48.883264 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:48.883602 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:48.883663 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:49.383164 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:49.383241 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:49.383569 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:49.883290 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:49.883367 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:49.883671 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:50.383185 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:50.383268 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:50.383648 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:50.883431 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:50.883514 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:50.883850 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:50.883909 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:51.383652 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:51.383720 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:51.383978 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:51.883444 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:51.883523 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:51.883866 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:52.383586 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:52.383680 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:52.384026 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:52.883655 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:52.883728 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:52.884053 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:52.884105 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:53.383855 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:53.383945 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:53.384271 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:53.884101 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:53.884186 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:53.884529 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:54.383101 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:54.383176 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:54.383443 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:54.883148 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:54.883222 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:54.883575 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:55.383191 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:55.383270 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:55.383608 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:55.383664 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:55.883870 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:55.883946 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:55.884289 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:56.383256 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:56.383351 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:56.383747 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:56.883462 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:56.883538 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:56.883871 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:57.383563 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:57.383638 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:57.383899 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:57.383944 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:57.883683 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:57.883768 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:57.884147 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:58.383932 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:58.384008 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:58.384395 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:58.883091 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:58.883159 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:58.883412 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:59.383084 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:59.383166 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:59.383498 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:59.883178 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:59.883259 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:59.883595 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:59.883655 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:00.392124 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:00.392210 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:00.392556 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:00.883180 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:00.883282 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:00.883653 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:01.383190 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:01.383267 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:01.383567 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:01.883245 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:01.883313 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:01.883583 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:02.383189 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:02.383267 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:02.383605 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:02.383667 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:02.883233 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:02.883317 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:02.883620 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:03.383856 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:03.383927 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:03.384185 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:03.884056 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:03.884135 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:03.884494 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:04.383223 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:04.383311 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:04.383613 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:04.883276 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:04.883345 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:04.883599 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:04.883643 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:05.383163 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:05.383239 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:05.383541 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:05.883213 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:05.883295 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:05.883634 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:06.383304 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:06.383375 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:06.383679 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:06.883401 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:06.883483 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:06.883806 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:06.883865 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:07.383190 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:07.383271 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:07.383648 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:07.883325 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:07.883398 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:07.883710 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:08.383188 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:08.383266 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:08.383566 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:08.883267 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:08.883345 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:08.883690 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:09.384042 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:09.384118 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:09.384458 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:09.384510 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:09.883180 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:09.883253 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:09.883573 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:10.383180 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:10.383261 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:10.383583 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:10.883127 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:10.883204 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:10.883474 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:11.383167 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:11.383240 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:11.383552 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:11.883157 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:11.883234 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:11.883563 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:11.883618 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:12.383084 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:12.383153 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:12.383411 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:12.883181 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:12.883256 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:12.883591 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:13.383188 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:13.383265 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:13.383586 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:13.883123 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:13.883204 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:13.883485 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:14.383559 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:14.383638 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:14.383953 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:14.384012 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:14.883792 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:14.883868 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:14.884213 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:15.383592 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:15.383667 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:15.383925 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:15.883767 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:15.883843 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:15.884202 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:16.383348 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:16.383420 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:16.383758 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:16.883457 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:16.883538 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:16.883795 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:16.883837 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:17.383161 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:17.383238 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:17.383573 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:17.883186 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:17.883271 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:17.883611 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:18.383302 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:18.383377 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:18.383637 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:18.883191 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:18.883269 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:18.883626 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:19.383147 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:19.383224 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:19.383554 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:19.383616 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:19.883116 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:19.883185 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:19.883449 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:20.383135 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:20.383213 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:20.383531 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:20.883180 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:20.883260 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:20.883559 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:21.383124 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:21.383200 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:21.383460 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:21.883132 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:21.883213 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:21.883553 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:21.883608 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:22.383153 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:22.383241 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:22.383543 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:22.883110 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:22.883179 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:22.883439 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:23.383165 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:23.383246 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:23.383640 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:23.883372 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:23.883448 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:23.883789 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:23.883846 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:24.383112 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:24.383188 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:24.383507 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:24.883200 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:24.883275 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:24.883561 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:25.383261 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:25.383336 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:25.383674 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:25.883358 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:25.883437 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:25.883749 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:26.383290 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:26.383368 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:26.383724 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:26.383783 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:26.883478 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:26.883555 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:26.883888 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:27.383604 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:27.383677 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:27.383939 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:27.883757 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:27.883845 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:27.884167 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:28.383852 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:28.383928 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:28.384269 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:28.384325 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:28.883626 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:28.883692 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:28.883958 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:29.383717 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:29.383796 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:29.384139 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:29.883960 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:29.884036 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:29.884369 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:30.383625 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:30.383694 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:30.383980 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:30.883744 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:30.883816 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:30.884150 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:30.884205 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:31.383977 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:31.384060 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:31.384393 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:31.883624 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:31.883716 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:31.883977 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:32.383741 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:32.383814 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:32.384155 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:32.883968 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:32.884055 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:32.884386 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:32.884443 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:33.383735 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:33.383805 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:33.384072 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:33.883915 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:33.883991 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:33.884369 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:34.383120 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:34.383204 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:34.383566 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:34.883846 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:34.883924 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:34.884224 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:35.383982 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:35.384056 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:35.384427 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:35.384483 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:35.884112 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:35.884192 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:35.884530 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:36.383425 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:36.383499 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:36.383766 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:36.883482 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:36.883565 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:36.883947 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:37.383742 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:37.383819 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:37.384158 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:37.883707 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:37.883774 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:37.884034 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:37.884074 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:38.383850 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:38.383958 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:38.384324 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:38.883075 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:38.883152 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:38.883501 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:39.383112 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:39.383186 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:39.383448 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:39.883223 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:39.883319 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:39.883638 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:40.383373 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:40.383445 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:40.383734 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:40.383787 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:40.883050 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:40.883126 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:40.883428 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:41.383217 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:41.383293 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:41.383634 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:41.883211 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:41.883294 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:41.883578 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:42.383069 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:42.383136 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:42.383390 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:42.883177 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:42.883268 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:42.883564 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:42.883610 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:43.383316 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:43.383402 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:43.383752 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:43.884076 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:43.884150 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:43.884466 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:44.383187 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:44.383282 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:44.383645 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:44.883389 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:44.883464 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:44.883804 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:44.883864 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:45.383117 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:45.383195 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:45.383502 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:45.883172 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:45.883255 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:45.883601 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:46.383362 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:46.383444 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:46.383798 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:46.883132 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:46.883204 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:46.883525 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:47.383245 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:47.383343 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:47.383724 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:47.383787 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:47.883322 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:47.883396 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:47.883705 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:48.383414 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:48.383490 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:48.383778 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:48.883192 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:48.883270 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:48.883613 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:49.383457 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:49.383533 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:49.383864 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:49.383922 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:49.883061 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:49.883134 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:49.883396 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:50.383133 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:50.383215 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:50.383592 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:50.883345 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:50.883424 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:50.883767 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:51.383609 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:51.383687 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:51.383946 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:51.383994 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:51.883714 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:51.883789 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:51.884128 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:52.383943 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:52.384028 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:52.384399 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:52.883710 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:52.883786 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:52.884049 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:53.383826 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:53.383902 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:53.384299 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:53.384353 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:53.883075 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:53.883154 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:53.883549 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:54.383241 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:54.383316 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:54.383579 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:54.883201 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:54.883281 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:54.883627 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:55.383208 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:55.383284 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:55.383613 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:55.883300 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:55.883372 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:55.883626 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:55.883666 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:56.383898 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:56.383987 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:56.384342 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:56.883076 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:56.883152 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:56.883529 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:57.383840 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:57.383919 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:57.384396 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:57.883127 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:57.883220 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:57.883601 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:58.383185 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:58.383263 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:58.383601 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:58.383658 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:58.883103 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:58.883174 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:58.883430 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:59.383161 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:59.383239 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:59.383528 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:59.883235 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:59.883319 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:59.883655 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:00.383085 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:00.383175 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:00.383480 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:00.883287 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:00.883398 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:00.883768 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:25:00.883828 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:25:01.383727 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:01.383809 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:01.384138 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:01.883728 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:01.883797 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:01.884120 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:02.383919 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:02.383991 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:02.384291 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:02.884049 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:02.884120 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:02.884420 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:25:02.884485 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:25:03.383819 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:03.383888 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:03.384209 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:03.884004 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:03.884091 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:03.884451 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:04.384095 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:04.384179 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:04.384501 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:04.883234 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:04.883303 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:04.883584 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:05.383144 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:05.383220 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:05.383542 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:25:05.383601 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:25:05.883200 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:05.883285 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:05.883658 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:06.383258 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:06.383334 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:06.383660 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:06.883258 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:06.883336 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:06.883680 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:07.383404 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:07.383485 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:07.383858 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:25:07.383913 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:25:07.883619 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:07.883699 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:07.883964 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:08.383741 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:08.383818 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:08.384168 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:08.883989 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:08.884068 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:08.884393 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:09.384092 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:09.384163 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:09.384427 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:25:09.384467 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:25:09.883175 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:09.883251 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:09.883564 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:10.383182 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:10.383261 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:10.383593 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:10.883126 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:10.883201 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:10.883461 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:11.383179 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:11.383273 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:11.383596 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:11.883182 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:11.883257 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:11.883609 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:25:11.883665 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:25:12.383318 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:12.383399 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:12.383715 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:12.883177 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:12.883251 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:12.883592 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:13.383299 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:13.383377 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:13.383726 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:13.883393 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:13.883461 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:13.883721 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:25:13.883763 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:25:14.383180 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:14.383258 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:14.383605 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:14.883184 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:14.883272 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:14.883660 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:15.383351 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:15.383434 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:15.383700 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:15.883201 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:15.883305 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:15.883711 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:16.383360 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:16.383441 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:16.383809 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:25:16.383867 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:25:16.883068 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:16.883136 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:16.883406 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:17.383093 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:17.383175 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:17.383513 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:17.883239 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:17.883322 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:17.883695 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:18.383395 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:18.383464 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:18.383742 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:18.883187 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:18.883264 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:18.883645 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:25:18.883700 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:25:19.383197 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:19.383275 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:19.383625 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:19.883335 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:19.883406 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:19.883791 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:20.383200 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:20.383305 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:20.383704 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:20.883416 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:20.883493 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:20.883891 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:25:20.883947 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:25:21.383661 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:21.383731 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:21.383987 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:21.883737 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:21.883815 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:21.884385 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:22.383106 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:22.383198 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:22.383565 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:22.883149 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:22.883216 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:22.883512 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:23.383192 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:23.383276 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:23.383626 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:25:23.383680 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:25:23.883354 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:23.883454 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:23.883802 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:24.383131 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:24.383205 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:24.383521 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:24.883214 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:24.883298 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:24.883675 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:25.383244 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:25.383317 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:25.383636 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:25.883064 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:25.883143 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:25.883420 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:25:25.883474 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:25:26.383164 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:26.383254 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:26.383617 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:26.883333 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:26.883410 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:26.883740 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:27.383124 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:27.383182 1701291 node_ready.go:38] duration metric: took 6m0.000242478s for node "functional-291288" to be "Ready" ...
	I1124 09:25:27.386338 1701291 out.go:203] 
	W1124 09:25:27.389204 1701291 out.go:285] X Exiting due to GUEST_START: failed to start node: wait 6m0s for node: waiting for node to be ready: WaitNodeCondition: context deadline exceeded
	X Exiting due to GUEST_START: failed to start node: wait 6m0s for node: waiting for node to be ready: WaitNodeCondition: context deadline exceeded
	W1124 09:25:27.389224 1701291 out.go:285] * 
	* 
	W1124 09:25:27.391374 1701291 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1124 09:25:27.394404 1701291 out.go:203] 

** /stderr **
functional_test.go:676: failed to soft start minikube. args "out/minikube-linux-arm64 start -p functional-291288 --alsologtostderr -v=8": exit status 80
functional_test.go:678: soft start took 6m7.013204125s for "functional-291288" cluster.
I1124 09:25:27.879838 1654467 config.go:182] Loaded profile config "functional-291288": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:223: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/SoftStart]: network settings <======
helpers_test.go:230: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:238: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/SoftStart]: docker inspect <======
helpers_test.go:239: (dbg) Run:  docker inspect functional-291288
helpers_test.go:243: (dbg) docker inspect functional-291288:

-- stdout --
	[
	    {
	        "Id": "70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52",
	        "Created": "2025-11-24T09:10:51.896020191Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 1695240,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-11-24T09:10:51.968983407Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:572c983e466f1f784136812eef5cc59ac623db764bc7704d3676c4643993fd08",
	        "ResolvConfPath": "/var/lib/docker/containers/70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52/hostname",
	        "HostsPath": "/var/lib/docker/containers/70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52/hosts",
	        "LogPath": "/var/lib/docker/containers/70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52/70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52-json.log",
	        "Name": "/functional-291288",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "functional-291288:/var",
	                "/lib/modules:/lib/modules:ro"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-291288",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52",
	                "LowerDir": "/var/lib/docker/overlay2/3cc6f5f8c809d502c515552ad283ef0dee330beb830a15376b0447c77fbc81b7-init/diff:/var/lib/docker/overlay2/bf294786c7ade59cb761c7bdcc27045362c3779b00a569b685443f34ade17737/diff",
	                "MergedDir": "/var/lib/docker/overlay2/3cc6f5f8c809d502c515552ad283ef0dee330beb830a15376b0447c77fbc81b7/merged",
	                "UpperDir": "/var/lib/docker/overlay2/3cc6f5f8c809d502c515552ad283ef0dee330beb830a15376b0447c77fbc81b7/diff",
	                "WorkDir": "/var/lib/docker/overlay2/3cc6f5f8c809d502c515552ad283ef0dee330beb830a15376b0447c77fbc81b7/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "volume",
	                "Name": "functional-291288",
	                "Source": "/var/lib/docker/volumes/functional-291288/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            },
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-291288",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-291288",
	                "name.minikube.sigs.k8s.io": "functional-291288",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "09c1c2eef0dca6362dde63b4cbc372c0cfa3e4fd084b8745043d8b88925691bf",
	            "SandboxKey": "/var/run/docker/netns/09c1c2eef0dc",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34684"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34685"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34688"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34686"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34687"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-291288": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "7e:49:22:0b:f9:2c",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "1e8f91e8ad9f46b831bbb1b0589b0022d940ee9875e64a648dc80612f3ca93dc",
	                    "EndpointID": "5de5ca8ccb07584b21e6e4e30dba12e0233e8d28c3e48e705cddffe75263b337",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-291288",
	                        "70848be15fcc"
	                    ]
	                }
	            }
	        }
	    }
	]

-- /stdout --
helpers_test.go:247: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-291288 -n functional-291288
helpers_test.go:247: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-291288 -n functional-291288: exit status 2 (344.66775ms)

-- stdout --
	Running

-- /stdout --
helpers_test.go:247: status error: exit status 2 (may be ok)
helpers_test.go:252: <<< TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/SoftStart FAILED: start of post-mortem logs <<<
helpers_test.go:253: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/SoftStart]: minikube logs <======
helpers_test.go:255: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 logs -n 25
helpers_test.go:260: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/SoftStart logs: 
-- stdout --
	
	==> Audit <==
	┌────────────────┬─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│    COMMAND     │                                                                          ARGS                                                                           │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├────────────────┼─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ ssh            │ functional-941011 ssh sudo umount -f /mount-9p                                                                                                          │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:02 UTC │                     │
	│ mount          │ -p functional-941011 /tmp/TestFunctionalparallelMountCmdVerifyCleanup1964745822/001:/mount1 --alsologtostderr -v=1                                      │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:02 UTC │                     │
	│ ssh            │ functional-941011 ssh findmnt -T /mount1                                                                                                                │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:02 UTC │ 24 Nov 25 09:02 UTC │
	│ mount          │ -p functional-941011 /tmp/TestFunctionalparallelMountCmdVerifyCleanup1964745822/001:/mount2 --alsologtostderr -v=1                                      │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:02 UTC │                     │
	│ mount          │ -p functional-941011 /tmp/TestFunctionalparallelMountCmdVerifyCleanup1964745822/001:/mount3 --alsologtostderr -v=1                                      │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:02 UTC │                     │
	│ ssh            │ functional-941011 ssh findmnt -T /mount2                                                                                                                │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:02 UTC │ 24 Nov 25 09:02 UTC │
	│ ssh            │ functional-941011 ssh findmnt -T /mount3                                                                                                                │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:02 UTC │ 24 Nov 25 09:02 UTC │
	│ mount          │ -p functional-941011 --kill=true                                                                                                                        │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:02 UTC │                     │
	│ start          │ -p functional-941011 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=containerd                                         │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:02 UTC │                     │
	│ start          │ -p functional-941011 --dry-run --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd                                                   │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:02 UTC │                     │
	│ start          │ -p functional-941011 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=containerd                                         │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:02 UTC │                     │
	│ dashboard      │ --url --port 36195 -p functional-941011 --alsologtostderr -v=1                                                                                          │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:02 UTC │                     │
	│ update-context │ functional-941011 update-context --alsologtostderr -v=2                                                                                                 │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:07 UTC │ 24 Nov 25 09:07 UTC │
	│ update-context │ functional-941011 update-context --alsologtostderr -v=2                                                                                                 │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:07 UTC │ 24 Nov 25 09:07 UTC │
	│ update-context │ functional-941011 update-context --alsologtostderr -v=2                                                                                                 │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:07 UTC │ 24 Nov 25 09:07 UTC │
	│ image          │ functional-941011 image ls --format short --alsologtostderr                                                                                             │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:07 UTC │ 24 Nov 25 09:07 UTC │
	│ image          │ functional-941011 image ls --format yaml --alsologtostderr                                                                                              │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:07 UTC │ 24 Nov 25 09:07 UTC │
	│ ssh            │ functional-941011 ssh pgrep buildkitd                                                                                                                   │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:07 UTC │                     │
	│ image          │ functional-941011 image build -t localhost/my-image:functional-941011 testdata/build --alsologtostderr                                                  │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:07 UTC │ 24 Nov 25 09:07 UTC │
	│ image          │ functional-941011 image ls                                                                                                                              │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:07 UTC │ 24 Nov 25 09:07 UTC │
	│ image          │ functional-941011 image ls --format json --alsologtostderr                                                                                              │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:07 UTC │ 24 Nov 25 09:07 UTC │
	│ image          │ functional-941011 image ls --format table --alsologtostderr                                                                                             │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:07 UTC │ 24 Nov 25 09:07 UTC │
	│ delete         │ -p functional-941011                                                                                                                                    │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:10 UTC │ 24 Nov 25 09:10 UTC │
	│ start          │ -p functional-291288 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-beta.0 │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:10 UTC │                     │
	│ start          │ -p functional-291288 --alsologtostderr -v=8                                                                                                             │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:19 UTC │                     │
	└────────────────┴─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/11/24 09:19:20
	Running on machine: ip-172-31-30-239
	Binary: Built with gc go1.25.3 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1124 09:19:20.929895 1701291 out.go:360] Setting OutFile to fd 1 ...
	I1124 09:19:20.930102 1701291 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1124 09:19:20.930128 1701291 out.go:374] Setting ErrFile to fd 2...
	I1124 09:19:20.930149 1701291 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1124 09:19:20.930488 1701291 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21978-1652607/.minikube/bin
	I1124 09:19:20.930883 1701291 out.go:368] Setting JSON to false
	I1124 09:19:20.931751 1701291 start.go:133] hostinfo: {"hostname":"ip-172-31-30-239","uptime":28890,"bootTime":1763947071,"procs":158,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"92f46a7d-c249-4c12-924a-77f64874c910"}
	I1124 09:19:20.931843 1701291 start.go:143] virtualization:  
	I1124 09:19:20.938521 1701291 out.go:179] * [functional-291288] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1124 09:19:20.941571 1701291 out.go:179]   - MINIKUBE_LOCATION=21978
	I1124 09:19:20.941660 1701291 notify.go:221] Checking for updates...
	I1124 09:19:20.947508 1701291 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1124 09:19:20.950282 1701291 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/21978-1652607/kubeconfig
	I1124 09:19:20.953189 1701291 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/21978-1652607/.minikube
	I1124 09:19:20.956068 1701291 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1124 09:19:20.958991 1701291 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1124 09:19:20.962273 1701291 config.go:182] Loaded profile config "functional-291288": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1124 09:19:20.962433 1701291 driver.go:422] Setting default libvirt URI to qemu:///system
	I1124 09:19:20.992476 1701291 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1124 09:19:20.992586 1701291 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1124 09:19:21.057666 1701291 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-11-24 09:19:21.047762616 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:a
arch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214831104 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx P
ath:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1124 09:19:21.057787 1701291 docker.go:319] overlay module found
	I1124 09:19:21.060830 1701291 out.go:179] * Using the docker driver based on existing profile
	I1124 09:19:21.063549 1701291 start.go:309] selected driver: docker
	I1124 09:19:21.063567 1701291 start.go:927] validating driver "docker" against &{Name:functional-291288 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-291288 Namespace:default APIServerHAVIP: APIS
erverName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false Disa
bleCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1124 09:19:21.063661 1701291 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1124 09:19:21.063775 1701291 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1124 09:19:21.121254 1701291 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-11-24 09:19:21.111151392 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:a
arch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214831104 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx P
ath:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1124 09:19:21.121789 1701291 cni.go:84] Creating CNI manager for ""
	I1124 09:19:21.121863 1701291 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1124 09:19:21.121942 1701291 start.go:353] cluster config:
	{Name:functional-291288 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-291288 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local C
ontainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPa
th: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1124 09:19:21.125134 1701291 out.go:179] * Starting "functional-291288" primary control-plane node in "functional-291288" cluster
	I1124 09:19:21.127989 1701291 cache.go:134] Beginning downloading kic base image for docker with containerd
	I1124 09:19:21.131005 1701291 out.go:179] * Pulling base image v0.0.48-1763789673-21948 ...
	I1124 09:19:21.133917 1701291 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f in local docker daemon
	I1124 09:19:21.133914 1701291 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
	I1124 09:19:21.154192 1701291 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f in local docker daemon, skipping pull
	I1124 09:19:21.154216 1701291 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f exists in daemon, skipping load
	W1124 09:19:21.197477 1701291 preload.go:144] https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.35.0-beta.0/preloaded-images-k8s-v18-v1.35.0-beta.0-containerd-overlay2-arm64.tar.lz4 status code: 404
	W1124 09:19:21.391690 1701291 preload.go:144] https://github.com/kubernetes-sigs/minikube-preloads/releases/download/v18/preloaded-images-k8s-v18-v1.35.0-beta.0-containerd-overlay2-arm64.tar.lz4 status code: 404
	I1124 09:19:21.391947 1701291 profile.go:143] Saving config to /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/config.json ...
	I1124 09:19:21.392070 1701291 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:19:21.392253 1701291 cache.go:243] Successfully downloaded all kic artifacts
	I1124 09:19:21.392304 1701291 start.go:360] acquireMachinesLock for functional-291288: {Name:mk85384dc057570e1f34db593d357cea738652c4 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:19:21.392403 1701291 start.go:364] duration metric: took 38.802µs to acquireMachinesLock for "functional-291288"
	I1124 09:19:21.392443 1701291 start.go:96] Skipping create...Using existing machine configuration
	I1124 09:19:21.392463 1701291 fix.go:54] fixHost starting: 
	I1124 09:19:21.392780 1701291 cli_runner.go:164] Run: docker container inspect functional-291288 --format={{.State.Status}}
	I1124 09:19:21.413220 1701291 fix.go:112] recreateIfNeeded on functional-291288: state=Running err=<nil>
	W1124 09:19:21.413254 1701291 fix.go:138] unexpected machine state, will restart: <nil>
	I1124 09:19:21.416439 1701291 out.go:252] * Updating the running docker "functional-291288" container ...
	I1124 09:19:21.416481 1701291 machine.go:94] provisionDockerMachine start ...
	I1124 09:19:21.416565 1701291 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:19:21.444143 1701291 main.go:143] libmachine: Using SSH client type: native
	I1124 09:19:21.444471 1701291 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 34684 <nil> <nil>}
	I1124 09:19:21.444480 1701291 main.go:143] libmachine: About to run SSH command:
	hostname
	I1124 09:19:21.581815 1701291 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:19:21.598566 1701291 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-291288
	
	I1124 09:19:21.598592 1701291 ubuntu.go:182] provisioning hostname "functional-291288"
	I1124 09:19:21.598669 1701291 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:19:21.623443 1701291 main.go:143] libmachine: Using SSH client type: native
	I1124 09:19:21.623759 1701291 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 34684 <nil> <nil>}
	I1124 09:19:21.623771 1701291 main.go:143] libmachine: About to run SSH command:
	sudo hostname functional-291288 && echo "functional-291288" | sudo tee /etc/hostname
	I1124 09:19:21.758572 1701291 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:19:21.799121 1701291 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-291288
	
	I1124 09:19:21.799200 1701291 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:19:21.831127 1701291 main.go:143] libmachine: Using SSH client type: native
	I1124 09:19:21.831435 1701291 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 34684 <nil> <nil>}
	I1124 09:19:21.831451 1701291 main.go:143] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sfunctional-291288' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 functional-291288/g' /etc/hosts;
				else 
					echo '127.0.1.1 functional-291288' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1124 09:19:21.919264 1701291 cache.go:107] acquiring lock: {Name:mk22a10f0ce1f3295b61e7e76c455d0494a3e278 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:19:21.919300 1701291 cache.go:107] acquiring lock: {Name:mk1cf42e67442503a46c578224bd3cb68bf682d4 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:19:21.919365 1701291 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 exists
	I1124 09:19:21.919369 1701291 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 exists
	I1124 09:19:21.919375 1701291 cache.go:96] cache image "registry.k8s.io/kube-apiserver:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0" took 74.126µs
	I1124 09:19:21.919377 1701291 cache.go:96] cache image "gcr.io/k8s-minikube/storage-provisioner:v5" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5" took 130.274µs
	I1124 09:19:21.919383 1701291 cache.go:80] save to tar file registry.k8s.io/kube-apiserver:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 succeeded
	I1124 09:19:21.919385 1701291 cache.go:80] save to tar file gcr.io/k8s-minikube/storage-provisioner:v5 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 succeeded
	I1124 09:19:21.919395 1701291 cache.go:107] acquiring lock: {Name:mkfdc49c8e68aee34cee0c9d441ae8a4dca675c6 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:19:21.919407 1701291 cache.go:107] acquiring lock: {Name:mk85f1502dbb97830776608fb729eb3605e112e6 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:19:21.919449 1701291 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 exists
	I1124 09:19:21.919433 1701291 cache.go:107] acquiring lock: {Name:mkdbf38e05e2c47c1a7a906a2236e9e7020a94c5 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:19:21.919454 1701291 cache.go:96] cache image "registry.k8s.io/pause:3.10.1" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1" took 48.764µs
	I1124 09:19:21.919460 1701291 cache.go:80] save to tar file registry.k8s.io/pause:3.10.1 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 succeeded
	I1124 09:19:21.919471 1701291 cache.go:107] acquiring lock: {Name:mk46ce3b59d7e062b3dbc8a90fe5b4231f256471 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:19:21.919266 1701291 cache.go:107] acquiring lock: {Name:mk80fdbe7cdb5bc17c2a82b4ecfd00214559a435 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:19:21.919495 1701291 cache.go:107] acquiring lock: {Name:mk726502cb84c177b2e14fee88512325761511c5 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:19:21.919506 1701291 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 exists
	I1124 09:19:21.919511 1701291 cache.go:96] cache image "registry.k8s.io/kube-proxy:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0" took 262.074µs
	I1124 09:19:21.919517 1701291 cache.go:80] save to tar file registry.k8s.io/kube-proxy:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 succeeded
	I1124 09:19:21.919425 1701291 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 exists
	I1124 09:19:21.919525 1701291 cache.go:96] cache image "registry.k8s.io/kube-controller-manager:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0" took 132.661µs
	I1124 09:19:21.919532 1701291 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 exists
	I1124 09:19:21.919476 1701291 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 exists
	I1124 09:19:21.919540 1701291 cache.go:96] cache image "registry.k8s.io/coredns/coredns:v1.13.1" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1" took 48.796µs
	I1124 09:19:21.919547 1701291 cache.go:80] save to tar file registry.k8s.io/coredns/coredns:v1.13.1 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 succeeded
	I1124 09:19:21.919541 1701291 cache.go:96] cache image "registry.k8s.io/kube-scheduler:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0" took 109.4µs
	I1124 09:19:21.919553 1701291 cache.go:80] save to tar file registry.k8s.io/kube-scheduler:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 succeeded
	I1124 09:19:21.919533 1701291 cache.go:80] save to tar file registry.k8s.io/kube-controller-manager:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 succeeded
	I1124 09:19:21.919557 1701291 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.5.24-0 exists
	I1124 09:19:21.919563 1701291 cache.go:96] cache image "registry.k8s.io/etcd:3.5.24-0" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.5.24-0" took 93.482µs
	I1124 09:19:21.919568 1701291 cache.go:80] save to tar file registry.k8s.io/etcd:3.5.24-0 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.5.24-0 succeeded
	I1124 09:19:21.919582 1701291 cache.go:87] Successfully saved all images to host disk.
	I1124 09:19:21.982718 1701291 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	I1124 09:19:21.982799 1701291 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/21978-1652607/.minikube CaCertPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/21978-1652607/.minikube}
	I1124 09:19:21.982852 1701291 ubuntu.go:190] setting up certificates
	I1124 09:19:21.982880 1701291 provision.go:84] configureAuth start
	I1124 09:19:21.982954 1701291 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-291288
	I1124 09:19:22.001413 1701291 provision.go:143] copyHostCerts
	I1124 09:19:22.001464 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.pem
	I1124 09:19:22.001516 1701291 exec_runner.go:144] found /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.pem, removing ...
	I1124 09:19:22.001530 1701291 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.pem
	I1124 09:19:22.001614 1701291 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.pem (1078 bytes)
	I1124 09:19:22.001708 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cert.pem
	I1124 09:19:22.001726 1701291 exec_runner.go:144] found /home/jenkins/minikube-integration/21978-1652607/.minikube/cert.pem, removing ...
	I1124 09:19:22.001731 1701291 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21978-1652607/.minikube/cert.pem
	I1124 09:19:22.001757 1701291 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/21978-1652607/.minikube/cert.pem (1123 bytes)
	I1124 09:19:22.001795 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/21978-1652607/.minikube/key.pem
	I1124 09:19:22.001816 1701291 exec_runner.go:144] found /home/jenkins/minikube-integration/21978-1652607/.minikube/key.pem, removing ...
	I1124 09:19:22.001820 1701291 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21978-1652607/.minikube/key.pem
	I1124 09:19:22.001845 1701291 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/21978-1652607/.minikube/key.pem (1679 bytes)
	I1124 09:19:22.001893 1701291 provision.go:117] generating server cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca-key.pem org=jenkins.functional-291288 san=[127.0.0.1 192.168.49.2 functional-291288 localhost minikube]
	I1124 09:19:22.129571 1701291 provision.go:177] copyRemoteCerts
	I1124 09:19:22.129639 1701291 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1124 09:19:22.129681 1701291 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:19:22.147944 1701291 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34684 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-291288/id_rsa Username:docker}
	I1124 09:19:22.254207 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I1124 09:19:22.254271 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I1124 09:19:22.271706 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server.pem -> /etc/docker/server.pem
	I1124 09:19:22.271768 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server.pem --> /etc/docker/server.pem (1220 bytes)
	I1124 09:19:22.289262 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I1124 09:19:22.289325 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I1124 09:19:22.306621 1701291 provision.go:87] duration metric: took 323.706379ms to configureAuth
	I1124 09:19:22.306647 1701291 ubuntu.go:206] setting minikube options for container-runtime
	I1124 09:19:22.306839 1701291 config.go:182] Loaded profile config "functional-291288": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1124 09:19:22.306847 1701291 machine.go:97] duration metric: took 890.360502ms to provisionDockerMachine
	I1124 09:19:22.306855 1701291 start.go:293] postStartSetup for "functional-291288" (driver="docker")
	I1124 09:19:22.306866 1701291 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1124 09:19:22.306912 1701291 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1124 09:19:22.306953 1701291 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:19:22.324012 1701291 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34684 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-291288/id_rsa Username:docker}
	I1124 09:19:22.434427 1701291 ssh_runner.go:195] Run: cat /etc/os-release
	I1124 09:19:22.437860 1701291 command_runner.go:130] > PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	I1124 09:19:22.437881 1701291 command_runner.go:130] > NAME="Debian GNU/Linux"
	I1124 09:19:22.437886 1701291 command_runner.go:130] > VERSION_ID="12"
	I1124 09:19:22.437890 1701291 command_runner.go:130] > VERSION="12 (bookworm)"
	I1124 09:19:22.437898 1701291 command_runner.go:130] > VERSION_CODENAME=bookworm
	I1124 09:19:22.437901 1701291 command_runner.go:130] > ID=debian
	I1124 09:19:22.437906 1701291 command_runner.go:130] > HOME_URL="https://www.debian.org/"
	I1124 09:19:22.437910 1701291 command_runner.go:130] > SUPPORT_URL="https://www.debian.org/support"
	I1124 09:19:22.437917 1701291 command_runner.go:130] > BUG_REPORT_URL="https://bugs.debian.org/"
	I1124 09:19:22.437980 1701291 main.go:143] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1124 09:19:22.437995 1701291 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1124 09:19:22.438006 1701291 filesync.go:126] Scanning /home/jenkins/minikube-integration/21978-1652607/.minikube/addons for local assets ...
	I1124 09:19:22.438064 1701291 filesync.go:126] Scanning /home/jenkins/minikube-integration/21978-1652607/.minikube/files for local assets ...
	I1124 09:19:22.438143 1701291 filesync.go:149] local asset: /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/ssl/certs/16544672.pem -> 16544672.pem in /etc/ssl/certs
	I1124 09:19:22.438150 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/ssl/certs/16544672.pem -> /etc/ssl/certs/16544672.pem
	I1124 09:19:22.438232 1701291 filesync.go:149] local asset: /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/test/nested/copy/1654467/hosts -> hosts in /etc/test/nested/copy/1654467
	I1124 09:19:22.438236 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/test/nested/copy/1654467/hosts -> /etc/test/nested/copy/1654467/hosts
	I1124 09:19:22.438277 1701291 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs /etc/test/nested/copy/1654467
	I1124 09:19:22.446265 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/ssl/certs/16544672.pem --> /etc/ssl/certs/16544672.pem (1708 bytes)
	I1124 09:19:22.463769 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/test/nested/copy/1654467/hosts --> /etc/test/nested/copy/1654467/hosts (40 bytes)
	I1124 09:19:22.481365 1701291 start.go:296] duration metric: took 174.495413ms for postStartSetup
	I1124 09:19:22.481446 1701291 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1124 09:19:22.481495 1701291 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:19:22.498552 1701291 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34684 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-291288/id_rsa Username:docker}
	I1124 09:19:22.598952 1701291 command_runner.go:130] > 14%
	I1124 09:19:22.599551 1701291 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1124 09:19:22.604050 1701291 command_runner.go:130] > 168G
	I1124 09:19:22.604631 1701291 fix.go:56] duration metric: took 1.212164413s for fixHost
	I1124 09:19:22.604655 1701291 start.go:83] releasing machines lock for "functional-291288", held for 1.212220037s
	I1124 09:19:22.604753 1701291 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-291288
	I1124 09:19:22.621885 1701291 ssh_runner.go:195] Run: cat /version.json
	I1124 09:19:22.621944 1701291 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:19:22.622207 1701291 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1124 09:19:22.622270 1701291 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:19:22.640397 1701291 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34684 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-291288/id_rsa Username:docker}
	I1124 09:19:22.648463 1701291 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34684 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-291288/id_rsa Username:docker}
	I1124 09:19:22.746016 1701291 command_runner.go:130] > {"iso_version": "v1.37.0-1763503576-21924", "kicbase_version": "v0.0.48-1763789673-21948", "minikube_version": "v1.37.0", "commit": "2996c7ec74d570fa8ab37e6f4f8813150d0c7473"}
	I1124 09:19:22.746158 1701291 ssh_runner.go:195] Run: systemctl --version
	I1124 09:19:22.840219 1701291 command_runner.go:130] > <a href="https://github.com/kubernetes/registry.k8s.io">Temporary Redirect</a>.
	I1124 09:19:22.840264 1701291 command_runner.go:130] > systemd 252 (252.39-1~deb12u1)
	I1124 09:19:22.840285 1701291 command_runner.go:130] > +PAM +AUDIT +SELINUX +APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT +QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified
	I1124 09:19:22.840354 1701291 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I1124 09:19:22.844675 1701291 command_runner.go:130] ! stat: cannot statx '/etc/cni/net.d/*loopback.conf*': No such file or directory
	W1124 09:19:22.844725 1701291 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1124 09:19:22.844793 1701291 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1124 09:19:22.852461 1701291 cni.go:259] no active bridge cni configs found in "/etc/cni/net.d" - nothing to disable
	I1124 09:19:22.852484 1701291 start.go:496] detecting cgroup driver to use...
	I1124 09:19:22.852517 1701291 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1124 09:19:22.852584 1701291 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I1124 09:19:22.868240 1701291 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I1124 09:19:22.881367 1701291 docker.go:218] disabling cri-docker service (if available) ...
	I1124 09:19:22.881470 1701291 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1124 09:19:22.896889 1701291 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1124 09:19:22.910017 1701291 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1124 09:19:23.028071 1701291 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1124 09:19:23.171419 1701291 docker.go:234] disabling docker service ...
	I1124 09:19:23.171539 1701291 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1124 09:19:23.187505 1701291 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1124 09:19:23.201405 1701291 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1124 09:19:23.324426 1701291 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1124 09:19:23.445186 1701291 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1124 09:19:23.457903 1701291 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I1124 09:19:23.470553 1701291 command_runner.go:130] > runtime-endpoint: unix:///run/containerd/containerd.sock
	I1124 09:19:23.472034 1701291 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:19:23.623898 1701291 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10.1"|' /etc/containerd/config.toml"
	I1124 09:19:23.632988 1701291 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I1124 09:19:23.641976 1701291 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I1124 09:19:23.642063 1701291 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I1124 09:19:23.651244 1701291 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1124 09:19:23.660198 1701291 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I1124 09:19:23.668706 1701291 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1124 09:19:23.677261 1701291 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1124 09:19:23.685600 1701291 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I1124 09:19:23.694593 1701291 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I1124 09:19:23.703191 1701291 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I1124 09:19:23.712006 1701291 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1124 09:19:23.718640 1701291 command_runner.go:130] > net.bridge.bridge-nf-call-iptables = 1
	I1124 09:19:23.719691 1701291 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1124 09:19:23.727172 1701291 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1124 09:19:23.844539 1701291 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I1124 09:19:23.964625 1701291 start.go:543] Will wait 60s for socket path /run/containerd/containerd.sock
	I1124 09:19:23.964708 1701291 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I1124 09:19:23.969624 1701291 command_runner.go:130] >   File: /run/containerd/containerd.sock
	I1124 09:19:23.969648 1701291 command_runner.go:130] >   Size: 0         	Blocks: 0          IO Block: 4096   socket
	I1124 09:19:23.969655 1701291 command_runner.go:130] > Device: 0,72	Inode: 1619        Links: 1
	I1124 09:19:23.969671 1701291 command_runner.go:130] > Access: (0660/srw-rw----)  Uid: (    0/    root)   Gid: (    0/    root)
	I1124 09:19:23.969685 1701291 command_runner.go:130] > Access: 2025-11-24 09:19:23.931843190 +0000
	I1124 09:19:23.969693 1701291 command_runner.go:130] > Modify: 2025-11-24 09:19:23.931843190 +0000
	I1124 09:19:23.969699 1701291 command_runner.go:130] > Change: 2025-11-24 09:19:23.931843190 +0000
	I1124 09:19:23.969707 1701291 command_runner.go:130] >  Birth: -
	I1124 09:19:23.970283 1701291 start.go:564] Will wait 60s for crictl version
	I1124 09:19:23.970345 1701291 ssh_runner.go:195] Run: which crictl
	I1124 09:19:23.973724 1701291 command_runner.go:130] > /usr/local/bin/crictl
	I1124 09:19:23.974288 1701291 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1124 09:19:23.995301 1701291 command_runner.go:130] > Version:  0.1.0
	I1124 09:19:23.995587 1701291 command_runner.go:130] > RuntimeName:  containerd
	I1124 09:19:23.995841 1701291 command_runner.go:130] > RuntimeVersion:  v2.1.5
	I1124 09:19:23.996049 1701291 command_runner.go:130] > RuntimeApiVersion:  v1
	I1124 09:19:23.998158 1701291 start.go:580] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v2.1.5
	RuntimeApiVersion:  v1
	I1124 09:19:23.998238 1701291 ssh_runner.go:195] Run: containerd --version
	I1124 09:19:24.020107 1701291 command_runner.go:130] > containerd containerd.io v2.1.5 fcd43222d6b07379a4be9786bda52438f0dd16a1
	I1124 09:19:24.020449 1701291 ssh_runner.go:195] Run: containerd --version
	I1124 09:19:24.041776 1701291 command_runner.go:130] > containerd containerd.io v2.1.5 fcd43222d6b07379a4be9786bda52438f0dd16a1
	I1124 09:19:24.047417 1701291 out.go:179] * Preparing Kubernetes v1.35.0-beta.0 on containerd 2.1.5 ...
	I1124 09:19:24.050497 1701291 cli_runner.go:164] Run: docker network inspect functional-291288 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1124 09:19:24.067531 1701291 ssh_runner.go:195] Run: grep 192.168.49.1	host.minikube.internal$ /etc/hosts
	I1124 09:19:24.071507 1701291 command_runner.go:130] > 192.168.49.1	host.minikube.internal
	I1124 09:19:24.071622 1701291 kubeadm.go:884] updating cluster {Name:functional-291288 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-291288 Namespace:default APIServerHAVIP: APIServerName:minikub
eCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:fal
se CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1124 09:19:24.071797 1701291 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:19:24.253230 1701291 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:19:24.402285 1701291 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:19:24.552419 1701291 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
	I1124 09:19:24.552515 1701291 ssh_runner.go:195] Run: sudo crictl images --output json
	I1124 09:19:24.577200 1701291 command_runner.go:130] > {
	I1124 09:19:24.577221 1701291 command_runner.go:130] >   "images":  [
	I1124 09:19:24.577226 1701291 command_runner.go:130] >     {
	I1124 09:19:24.577235 1701291 command_runner.go:130] >       "id":  "sha256:66749159455b3f08c8318fe0233122f54d0f5889f9c5fdfb73c3fd9d99895b51",
	I1124 09:19:24.577240 1701291 command_runner.go:130] >       "repoTags":  [
	I1124 09:19:24.577245 1701291 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner:v5"
	I1124 09:19:24.577248 1701291 command_runner.go:130] >       ],
	I1124 09:19:24.577252 1701291 command_runner.go:130] >       "repoDigests":  [],
	I1124 09:19:24.577256 1701291 command_runner.go:130] >       "size":  "8032639",
	I1124 09:19:24.577264 1701291 command_runner.go:130] >       "username":  "",
	I1124 09:19:24.577269 1701291 command_runner.go:130] >       "pinned":  false
	I1124 09:19:24.577272 1701291 command_runner.go:130] >     },
	I1124 09:19:24.577276 1701291 command_runner.go:130] >     {
	I1124 09:19:24.577283 1701291 command_runner.go:130] >       "id":  "sha256:e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf",
	I1124 09:19:24.577290 1701291 command_runner.go:130] >       "repoTags":  [
	I1124 09:19:24.577296 1701291 command_runner.go:130] >         "registry.k8s.io/coredns/coredns:v1.13.1"
	I1124 09:19:24.577299 1701291 command_runner.go:130] >       ],
	I1124 09:19:24.577308 1701291 command_runner.go:130] >       "repoDigests":  [],
	I1124 09:19:24.577330 1701291 command_runner.go:130] >       "size":  "21166088",
	I1124 09:19:24.577335 1701291 command_runner.go:130] >       "username":  "nonroot",
	I1124 09:19:24.577339 1701291 command_runner.go:130] >       "pinned":  false
	I1124 09:19:24.577349 1701291 command_runner.go:130] >     },
	I1124 09:19:24.577357 1701291 command_runner.go:130] >     {
	I1124 09:19:24.577364 1701291 command_runner.go:130] >       "id":  "sha256:1211402d28f5813ed906916bfcdd0a7404c2f9048ef5bb54387a6745bc410eca",
	I1124 09:19:24.577368 1701291 command_runner.go:130] >       "repoTags":  [
	I1124 09:19:24.577373 1701291 command_runner.go:130] >         "registry.k8s.io/etcd:3.5.24-0"
	I1124 09:19:24.577376 1701291 command_runner.go:130] >       ],
	I1124 09:19:24.577380 1701291 command_runner.go:130] >       "repoDigests":  [],
	I1124 09:19:24.577384 1701291 command_runner.go:130] >       "size":  "21880804",
	I1124 09:19:24.577391 1701291 command_runner.go:130] >       "uid":  {
	I1124 09:19:24.577395 1701291 command_runner.go:130] >         "value":  "0"
	I1124 09:19:24.577400 1701291 command_runner.go:130] >       },
	I1124 09:19:24.577404 1701291 command_runner.go:130] >       "username":  "",
	I1124 09:19:24.577408 1701291 command_runner.go:130] >       "pinned":  false
	I1124 09:19:24.577421 1701291 command_runner.go:130] >     },
	I1124 09:19:24.577424 1701291 command_runner.go:130] >     {
	I1124 09:19:24.577431 1701291 command_runner.go:130] >       "id":  "sha256:2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42",
	I1124 09:19:24.577434 1701291 command_runner.go:130] >       "repoTags":  [
	I1124 09:19:24.577443 1701291 command_runner.go:130] >         "registry.k8s.io/etcd:3.6.5-0"
	I1124 09:19:24.577450 1701291 command_runner.go:130] >       ],
	I1124 09:19:24.577454 1701291 command_runner.go:130] >       "repoDigests":  [
	I1124 09:19:24.577461 1701291 command_runner.go:130] >         "registry.k8s.io/etcd@sha256:042ef9c02799eb9303abf1aa99b09f09d94b8ee3ba0c2dd3f42dc4e1d3dce534"
	I1124 09:19:24.577465 1701291 command_runner.go:130] >       ],
	I1124 09:19:24.577469 1701291 command_runner.go:130] >       "size":  "21136588",
	I1124 09:19:24.577472 1701291 command_runner.go:130] >       "uid":  {
	I1124 09:19:24.577479 1701291 command_runner.go:130] >         "value":  "0"
	I1124 09:19:24.577482 1701291 command_runner.go:130] >       },
	I1124 09:19:24.577486 1701291 command_runner.go:130] >       "username":  "",
	I1124 09:19:24.577492 1701291 command_runner.go:130] >       "pinned":  false
	I1124 09:19:24.577495 1701291 command_runner.go:130] >     },
	I1124 09:19:24.577502 1701291 command_runner.go:130] >     {
	I1124 09:19:24.577512 1701291 command_runner.go:130] >       "id":  "sha256:ccd634d9bcc36ac6235e9c86676cd3a02c06afc3788a25f1bbf39ca7d44585f4",
	I1124 09:19:24.577516 1701291 command_runner.go:130] >       "repoTags":  [
	I1124 09:19:24.577521 1701291 command_runner.go:130] >         "registry.k8s.io/kube-apiserver:v1.35.0-beta.0"
	I1124 09:19:24.577527 1701291 command_runner.go:130] >       ],
	I1124 09:19:24.577531 1701291 command_runner.go:130] >       "repoDigests":  [],
	I1124 09:19:24.577535 1701291 command_runner.go:130] >       "size":  "24676285",
	I1124 09:19:24.577538 1701291 command_runner.go:130] >       "uid":  {
	I1124 09:19:24.577541 1701291 command_runner.go:130] >         "value":  "0"
	I1124 09:19:24.577545 1701291 command_runner.go:130] >       },
	I1124 09:19:24.577550 1701291 command_runner.go:130] >       "username":  "",
	I1124 09:19:24.577556 1701291 command_runner.go:130] >       "pinned":  false
	I1124 09:19:24.577560 1701291 command_runner.go:130] >     },
	I1124 09:19:24.577563 1701291 command_runner.go:130] >     {
	I1124 09:19:24.577569 1701291 command_runner.go:130] >       "id":  "sha256:68b5f775f18769fcb77bd8474c80bda2050163b6c66f4551f352b7381b8ca5be",
	I1124 09:19:24.577581 1701291 command_runner.go:130] >       "repoTags":  [
	I1124 09:19:24.577586 1701291 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager:v1.35.0-beta.0"
	I1124 09:19:24.577590 1701291 command_runner.go:130] >       ],
	I1124 09:19:24.577594 1701291 command_runner.go:130] >       "repoDigests":  [],
	I1124 09:19:24.577605 1701291 command_runner.go:130] >       "size":  "20658969",
	I1124 09:19:24.577608 1701291 command_runner.go:130] >       "uid":  {
	I1124 09:19:24.577612 1701291 command_runner.go:130] >         "value":  "0"
	I1124 09:19:24.577615 1701291 command_runner.go:130] >       },
	I1124 09:19:24.577619 1701291 command_runner.go:130] >       "username":  "",
	I1124 09:19:24.577624 1701291 command_runner.go:130] >       "pinned":  false
	I1124 09:19:24.577629 1701291 command_runner.go:130] >     },
	I1124 09:19:24.577633 1701291 command_runner.go:130] >     {
	I1124 09:19:24.577644 1701291 command_runner.go:130] >       "id":  "sha256:404c2e12861777b763b8feaa316d36680fc68ad308a8d2f6e55f1bb981cdd904",
	I1124 09:19:24.577655 1701291 command_runner.go:130] >       "repoTags":  [
	I1124 09:19:24.577660 1701291 command_runner.go:130] >         "registry.k8s.io/kube-proxy:v1.35.0-beta.0"
	I1124 09:19:24.577663 1701291 command_runner.go:130] >       ],
	I1124 09:19:24.577667 1701291 command_runner.go:130] >       "repoDigests":  [],
	I1124 09:19:24.577678 1701291 command_runner.go:130] >       "size":  "22428165",
	I1124 09:19:24.577686 1701291 command_runner.go:130] >       "username":  "",
	I1124 09:19:24.577692 1701291 command_runner.go:130] >       "pinned":  false
	I1124 09:19:24.577696 1701291 command_runner.go:130] >     },
	I1124 09:19:24.577706 1701291 command_runner.go:130] >     {
	I1124 09:19:24.577712 1701291 command_runner.go:130] >       "id":  "sha256:16378741539f1be9c6e347d127537d379a6592587b09b4eb47964cb5c43a409b",
	I1124 09:19:24.577716 1701291 command_runner.go:130] >       "repoTags":  [
	I1124 09:19:24.577721 1701291 command_runner.go:130] >         "registry.k8s.io/kube-scheduler:v1.35.0-beta.0"
	I1124 09:19:24.577724 1701291 command_runner.go:130] >       ],
	I1124 09:19:24.577728 1701291 command_runner.go:130] >       "repoDigests":  [],
	I1124 09:19:24.577738 1701291 command_runner.go:130] >       "size":  "15389290",
	I1124 09:19:24.577744 1701291 command_runner.go:130] >       "uid":  {
	I1124 09:19:24.577751 1701291 command_runner.go:130] >         "value":  "0"
	I1124 09:19:24.577754 1701291 command_runner.go:130] >       },
	I1124 09:19:24.577758 1701291 command_runner.go:130] >       "username":  "",
	I1124 09:19:24.577762 1701291 command_runner.go:130] >       "pinned":  false
	I1124 09:19:24.577768 1701291 command_runner.go:130] >     },
	I1124 09:19:24.577771 1701291 command_runner.go:130] >     {
	I1124 09:19:24.577779 1701291 command_runner.go:130] >       "id":  "sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd",
	I1124 09:19:24.577786 1701291 command_runner.go:130] >       "repoTags":  [
	I1124 09:19:24.577791 1701291 command_runner.go:130] >         "registry.k8s.io/pause:3.10.1"
	I1124 09:19:24.577794 1701291 command_runner.go:130] >       ],
	I1124 09:19:24.577797 1701291 command_runner.go:130] >       "repoDigests":  [],
	I1124 09:19:24.577801 1701291 command_runner.go:130] >       "size":  "265458",
	I1124 09:19:24.577805 1701291 command_runner.go:130] >       "uid":  {
	I1124 09:19:24.577809 1701291 command_runner.go:130] >         "value":  "65535"
	I1124 09:19:24.577815 1701291 command_runner.go:130] >       },
	I1124 09:19:24.577819 1701291 command_runner.go:130] >       "username":  "",
	I1124 09:19:24.577824 1701291 command_runner.go:130] >       "pinned":  true
	I1124 09:19:24.577827 1701291 command_runner.go:130] >     }
	I1124 09:19:24.577831 1701291 command_runner.go:130] >   ]
	I1124 09:19:24.577842 1701291 command_runner.go:130] > }
	I1124 09:19:24.577988 1701291 containerd.go:627] all images are preloaded for containerd runtime.
	I1124 09:19:24.578000 1701291 cache_images.go:86] Images are preloaded, skipping loading
	I1124 09:19:24.578012 1701291 kubeadm.go:935] updating node { 192.168.49.2 8441 v1.35.0-beta.0 containerd true true} ...
	I1124 09:19:24.578111 1701291 kubeadm.go:947] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.35.0-beta.0/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=functional-291288 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.49.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-291288 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I1124 09:19:24.578176 1701291 ssh_runner.go:195] Run: sudo crictl info
	I1124 09:19:24.601872 1701291 command_runner.go:130] > {
	I1124 09:19:24.601895 1701291 command_runner.go:130] >   "cniconfig": {
	I1124 09:19:24.601901 1701291 command_runner.go:130] >     "Networks": [
	I1124 09:19:24.601905 1701291 command_runner.go:130] >       {
	I1124 09:19:24.601909 1701291 command_runner.go:130] >         "Config": {
	I1124 09:19:24.601914 1701291 command_runner.go:130] >           "CNIVersion": "0.3.1",
	I1124 09:19:24.601919 1701291 command_runner.go:130] >           "Name": "cni-loopback",
	I1124 09:19:24.601924 1701291 command_runner.go:130] >           "Plugins": [
	I1124 09:19:24.601927 1701291 command_runner.go:130] >             {
	I1124 09:19:24.601931 1701291 command_runner.go:130] >               "Network": {
	I1124 09:19:24.601935 1701291 command_runner.go:130] >                 "ipam": {},
	I1124 09:19:24.601941 1701291 command_runner.go:130] >                 "type": "loopback"
	I1124 09:19:24.601945 1701291 command_runner.go:130] >               },
	I1124 09:19:24.601958 1701291 command_runner.go:130] >               "Source": "{\"type\":\"loopback\"}"
	I1124 09:19:24.601965 1701291 command_runner.go:130] >             }
	I1124 09:19:24.601969 1701291 command_runner.go:130] >           ],
	I1124 09:19:24.601979 1701291 command_runner.go:130] >           "Source": "{\n\"cniVersion\": \"0.3.1\",\n\"name\": \"cni-loopback\",\n\"plugins\": [{\n  \"type\": \"loopback\"\n}]\n}"
	I1124 09:19:24.601983 1701291 command_runner.go:130] >         },
	I1124 09:19:24.601991 1701291 command_runner.go:130] >         "IFName": "lo"
	I1124 09:19:24.601994 1701291 command_runner.go:130] >       }
	I1124 09:19:24.601997 1701291 command_runner.go:130] >     ],
	I1124 09:19:24.602003 1701291 command_runner.go:130] >     "PluginConfDir": "/etc/cni/net.d",
	I1124 09:19:24.602007 1701291 command_runner.go:130] >     "PluginDirs": [
	I1124 09:19:24.602014 1701291 command_runner.go:130] >       "/opt/cni/bin"
	I1124 09:19:24.602018 1701291 command_runner.go:130] >     ],
	I1124 09:19:24.602026 1701291 command_runner.go:130] >     "PluginMaxConfNum": 1,
	I1124 09:19:24.602030 1701291 command_runner.go:130] >     "Prefix": "eth"
	I1124 09:19:24.602033 1701291 command_runner.go:130] >   },
	I1124 09:19:24.602037 1701291 command_runner.go:130] >   "config": {
	I1124 09:19:24.602041 1701291 command_runner.go:130] >     "cdiSpecDirs": [
	I1124 09:19:24.602048 1701291 command_runner.go:130] >       "/etc/cdi",
	I1124 09:19:24.602051 1701291 command_runner.go:130] >       "/var/run/cdi"
	I1124 09:19:24.602055 1701291 command_runner.go:130] >     ],
	I1124 09:19:24.602069 1701291 command_runner.go:130] >     "cni": {
	I1124 09:19:24.602073 1701291 command_runner.go:130] >       "binDir": "",
	I1124 09:19:24.602076 1701291 command_runner.go:130] >       "binDirs": [
	I1124 09:19:24.602080 1701291 command_runner.go:130] >         "/opt/cni/bin"
	I1124 09:19:24.602083 1701291 command_runner.go:130] >       ],
	I1124 09:19:24.602087 1701291 command_runner.go:130] >       "confDir": "/etc/cni/net.d",
	I1124 09:19:24.602092 1701291 command_runner.go:130] >       "confTemplate": "",
	I1124 09:19:24.602098 1701291 command_runner.go:130] >       "ipPref": "",
	I1124 09:19:24.602103 1701291 command_runner.go:130] >       "maxConfNum": 1,
	I1124 09:19:24.602109 1701291 command_runner.go:130] >       "setupSerially": false,
	I1124 09:19:24.602114 1701291 command_runner.go:130] >       "useInternalLoopback": false
	I1124 09:19:24.602120 1701291 command_runner.go:130] >     },
	I1124 09:19:24.602126 1701291 command_runner.go:130] >     "containerd": {
	I1124 09:19:24.602132 1701291 command_runner.go:130] >       "defaultRuntimeName": "runc",
	I1124 09:19:24.602137 1701291 command_runner.go:130] >       "ignoreBlockIONotEnabledErrors": false,
	I1124 09:19:24.602145 1701291 command_runner.go:130] >       "ignoreRdtNotEnabledErrors": false,
	I1124 09:19:24.602149 1701291 command_runner.go:130] >       "runtimes": {
	I1124 09:19:24.602152 1701291 command_runner.go:130] >         "runc": {
	I1124 09:19:24.602157 1701291 command_runner.go:130] >           "ContainerAnnotations": null,
	I1124 09:19:24.602163 1701291 command_runner.go:130] >           "PodAnnotations": null,
	I1124 09:19:24.602169 1701291 command_runner.go:130] >           "baseRuntimeSpec": "",
	I1124 09:19:24.602174 1701291 command_runner.go:130] >           "cgroupWritable": false,
	I1124 09:19:24.602179 1701291 command_runner.go:130] >           "cniConfDir": "",
	I1124 09:19:24.602185 1701291 command_runner.go:130] >           "cniMaxConfNum": 0,
	I1124 09:19:24.602190 1701291 command_runner.go:130] >           "io_type": "",
	I1124 09:19:24.602195 1701291 command_runner.go:130] >           "options": {
	I1124 09:19:24.602200 1701291 command_runner.go:130] >             "BinaryName": "",
	I1124 09:19:24.602212 1701291 command_runner.go:130] >             "CriuImagePath": "",
	I1124 09:19:24.602217 1701291 command_runner.go:130] >             "CriuWorkPath": "",
	I1124 09:19:24.602221 1701291 command_runner.go:130] >             "IoGid": 0,
	I1124 09:19:24.602226 1701291 command_runner.go:130] >             "IoUid": 0,
	I1124 09:19:24.602232 1701291 command_runner.go:130] >             "NoNewKeyring": false,
	I1124 09:19:24.602237 1701291 command_runner.go:130] >             "Root": "",
	I1124 09:19:24.602243 1701291 command_runner.go:130] >             "ShimCgroup": "",
	I1124 09:19:24.602248 1701291 command_runner.go:130] >             "SystemdCgroup": false
	I1124 09:19:24.602252 1701291 command_runner.go:130] >           },
	I1124 09:19:24.602257 1701291 command_runner.go:130] >           "privileged_without_host_devices": false,
	I1124 09:19:24.602266 1701291 command_runner.go:130] >           "privileged_without_host_devices_all_devices_allowed": false,
	I1124 09:19:24.602272 1701291 command_runner.go:130] >           "runtimePath": "",
	I1124 09:19:24.602278 1701291 command_runner.go:130] >           "runtimeType": "io.containerd.runc.v2",
	I1124 09:19:24.602285 1701291 command_runner.go:130] >           "sandboxer": "podsandbox",
	I1124 09:19:24.602290 1701291 command_runner.go:130] >           "snapshotter": ""
	I1124 09:19:24.602293 1701291 command_runner.go:130] >         }
	I1124 09:19:24.602296 1701291 command_runner.go:130] >       }
	I1124 09:19:24.602299 1701291 command_runner.go:130] >     },
	I1124 09:19:24.602309 1701291 command_runner.go:130] >     "containerdEndpoint": "/run/containerd/containerd.sock",
	I1124 09:19:24.602332 1701291 command_runner.go:130] >     "containerdRootDir": "/var/lib/containerd",
	I1124 09:19:24.602339 1701291 command_runner.go:130] >     "device_ownership_from_security_context": false,
	I1124 09:19:24.602344 1701291 command_runner.go:130] >     "disableApparmor": false,
	I1124 09:19:24.602351 1701291 command_runner.go:130] >     "disableHugetlbController": true,
	I1124 09:19:24.602355 1701291 command_runner.go:130] >     "disableProcMount": false,
	I1124 09:19:24.602362 1701291 command_runner.go:130] >     "drainExecSyncIOTimeout": "0s",
	I1124 09:19:24.602366 1701291 command_runner.go:130] >     "enableCDI": true,
	I1124 09:19:24.602378 1701291 command_runner.go:130] >     "enableSelinux": false,
	I1124 09:19:24.602382 1701291 command_runner.go:130] >     "enableUnprivilegedICMP": true,
	I1124 09:19:24.602387 1701291 command_runner.go:130] >     "enableUnprivilegedPorts": true,
	I1124 09:19:24.602392 1701291 command_runner.go:130] >     "ignoreDeprecationWarnings": null,
	I1124 09:19:24.602403 1701291 command_runner.go:130] >     "ignoreImageDefinedVolumes": false,
	I1124 09:19:24.602408 1701291 command_runner.go:130] >     "maxContainerLogLineSize": 16384,
	I1124 09:19:24.602413 1701291 command_runner.go:130] >     "netnsMountsUnderStateDir": false,
	I1124 09:19:24.602417 1701291 command_runner.go:130] >     "restrictOOMScoreAdj": false,
	I1124 09:19:24.602422 1701291 command_runner.go:130] >     "rootDir": "/var/lib/containerd/io.containerd.grpc.v1.cri",
	I1124 09:19:24.602427 1701291 command_runner.go:130] >     "selinuxCategoryRange": 1024,
	I1124 09:19:24.602432 1701291 command_runner.go:130] >     "stateDir": "/run/containerd/io.containerd.grpc.v1.cri",
	I1124 09:19:24.602438 1701291 command_runner.go:130] >     "tolerateMissingHugetlbController": true,
	I1124 09:19:24.602441 1701291 command_runner.go:130] >     "unsetSeccompProfile": ""
	I1124 09:19:24.602445 1701291 command_runner.go:130] >   },
	I1124 09:19:24.602449 1701291 command_runner.go:130] >   "features": {
	I1124 09:19:24.602492 1701291 command_runner.go:130] >     "supplemental_groups_policy": true
	I1124 09:19:24.602500 1701291 command_runner.go:130] >   },
	I1124 09:19:24.602504 1701291 command_runner.go:130] >   "golang": "go1.24.9",
	I1124 09:19:24.602513 1701291 command_runner.go:130] >   "lastCNILoadStatus": "cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config",
	I1124 09:19:24.602527 1701291 command_runner.go:130] >   "lastCNILoadStatus.default": "cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config",
	I1124 09:19:24.602532 1701291 command_runner.go:130] >   "runtimeHandlers": [
	I1124 09:19:24.602537 1701291 command_runner.go:130] >     {
	I1124 09:19:24.602541 1701291 command_runner.go:130] >       "features": {
	I1124 09:19:24.602546 1701291 command_runner.go:130] >         "recursive_read_only_mounts": true,
	I1124 09:19:24.602550 1701291 command_runner.go:130] >         "user_namespaces": true
	I1124 09:19:24.602555 1701291 command_runner.go:130] >       }
	I1124 09:19:24.602564 1701291 command_runner.go:130] >     },
	I1124 09:19:24.602570 1701291 command_runner.go:130] >     {
	I1124 09:19:24.602575 1701291 command_runner.go:130] >       "features": {
	I1124 09:19:24.602587 1701291 command_runner.go:130] >         "recursive_read_only_mounts": true,
	I1124 09:19:24.602592 1701291 command_runner.go:130] >         "user_namespaces": true
	I1124 09:19:24.602595 1701291 command_runner.go:130] >       },
	I1124 09:19:24.602598 1701291 command_runner.go:130] >       "name": "runc"
	I1124 09:19:24.602609 1701291 command_runner.go:130] >     }
	I1124 09:19:24.602612 1701291 command_runner.go:130] >   ],
	I1124 09:19:24.602615 1701291 command_runner.go:130] >   "status": {
	I1124 09:19:24.602619 1701291 command_runner.go:130] >     "conditions": [
	I1124 09:19:24.602623 1701291 command_runner.go:130] >       {
	I1124 09:19:24.602629 1701291 command_runner.go:130] >         "message": "",
	I1124 09:19:24.602633 1701291 command_runner.go:130] >         "reason": "",
	I1124 09:19:24.602637 1701291 command_runner.go:130] >         "status": true,
	I1124 09:19:24.602641 1701291 command_runner.go:130] >         "type": "RuntimeReady"
	I1124 09:19:24.602645 1701291 command_runner.go:130] >       },
	I1124 09:19:24.602648 1701291 command_runner.go:130] >       {
	I1124 09:19:24.602655 1701291 command_runner.go:130] >         "message": "Network plugin returns error: cni plugin not initialized",
	I1124 09:19:24.602662 1701291 command_runner.go:130] >         "reason": "NetworkPluginNotReady",
	I1124 09:19:24.602666 1701291 command_runner.go:130] >         "status": false,
	I1124 09:19:24.602678 1701291 command_runner.go:130] >         "type": "NetworkReady"
	I1124 09:19:24.602682 1701291 command_runner.go:130] >       },
	I1124 09:19:24.602685 1701291 command_runner.go:130] >       {
	I1124 09:19:24.602688 1701291 command_runner.go:130] >         "message": "",
	I1124 09:19:24.602692 1701291 command_runner.go:130] >         "reason": "",
	I1124 09:19:24.602703 1701291 command_runner.go:130] >         "status": true,
	I1124 09:19:24.602709 1701291 command_runner.go:130] >         "type": "ContainerdHasNoDeprecationWarnings"
	I1124 09:19:24.602712 1701291 command_runner.go:130] >       }
	I1124 09:19:24.602715 1701291 command_runner.go:130] >     ]
	I1124 09:19:24.602718 1701291 command_runner.go:130] >   }
	I1124 09:19:24.602721 1701291 command_runner.go:130] > }
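	The JSON dump above is CRI runtime status output (the kind of document `crictl info` returns); `NetworkReady` is false here because no CNI config has been installed in /etc/cni/net.d yet. A minimal sketch of checking that condition from a saved copy of the output — `status.json` is a hypothetical file name, and the snippet is trimmed to the fields shown in the log:

```shell
# Save a trimmed copy of the runtime status (hypothetical file name).
cat > status.json <<'EOF'
{
  "status": {
    "conditions": [
      {"type": "RuntimeReady", "status": true},
      {"type": "NetworkReady", "status": false, "reason": "NetworkPluginNotReady"}
    ]
  }
}
EOF
# Grep-based check: succeeds while the CNI is uninitialized, and starts
# failing (non-zero exit) once NetworkReady flips to true.
grep -q '"type": "NetworkReady", "status": false' status.json && echo "network not ready"
# prints: network not ready
```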
	I1124 09:19:24.603033 1701291 cni.go:84] Creating CNI manager for ""
	I1124 09:19:24.603051 1701291 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1124 09:19:24.603074 1701291 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1124 09:19:24.603102 1701291 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.49.2 APIServerPort:8441 KubernetesVersion:v1.35.0-beta.0 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:functional-291288 NodeName:functional-291288 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.49.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.49.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1124 09:19:24.603228 1701291 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.49.2
	  bindPort: 8441
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "functional-291288"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.49.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8441
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.35.0-beta.0
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
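	minikube renders the config above as one multi-document YAML stream (InitConfiguration, ClusterConfiguration, KubeletConfiguration, KubeProxyConfiguration) and, as the scp line below shows, ships it to /var/tmp/minikube/kubeadm.yaml.new. A sketch verifying the document structure of such a file — `config.yaml` is a hypothetical stand-in path, with the documents trimmed to their headers:

```shell
# Write a trimmed multi-document kubeadm config (hypothetical path).
cat > config.yaml <<'EOF'
apiVersion: kubeadm.k8s.io/v1beta4
kind: InitConfiguration
---
apiVersion: kubeadm.k8s.io/v1beta4
kind: ClusterConfiguration
---
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
---
apiVersion: kubeproxy.config.k8s.io/v1alpha1
kind: KubeProxyConfiguration
EOF
# Four kinds separated by three `---` document markers.
grep -c '^kind:' config.yaml   # prints 4
```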
	
	I1124 09:19:24.603309 1701291 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.35.0-beta.0
	I1124 09:19:24.611119 1701291 command_runner.go:130] > kubeadm
	I1124 09:19:24.611140 1701291 command_runner.go:130] > kubectl
	I1124 09:19:24.611146 1701291 command_runner.go:130] > kubelet
	I1124 09:19:24.611161 1701291 binaries.go:51] Found k8s binaries, skipping transfer
	I1124 09:19:24.611223 1701291 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1124 09:19:24.618883 1701291 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (328 bytes)
	I1124 09:19:24.633448 1701291 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (359 bytes)
	I1124 09:19:24.650072 1701291 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2237 bytes)
	I1124 09:19:24.664688 1701291 ssh_runner.go:195] Run: grep 192.168.49.2	control-plane.minikube.internal$ /etc/hosts
	I1124 09:19:24.668362 1701291 command_runner.go:130] > 192.168.49.2	control-plane.minikube.internal
	I1124 09:19:24.668996 1701291 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1124 09:19:24.787731 1701291 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1124 09:19:25.630718 1701291 certs.go:69] Setting up /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288 for IP: 192.168.49.2
	I1124 09:19:25.630736 1701291 certs.go:195] generating shared ca certs ...
	I1124 09:19:25.630751 1701291 certs.go:227] acquiring lock for ca certs: {Name:mkbe540a30c4376a351176f7fe6fec044d058b09 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1124 09:19:25.630878 1701291 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.key
	I1124 09:19:25.630932 1701291 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/proxy-client-ca.key
	I1124 09:19:25.630939 1701291 certs.go:257] generating profile certs ...
	I1124 09:19:25.631060 1701291 certs.go:360] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/client.key
	I1124 09:19:25.631119 1701291 certs.go:360] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/apiserver.key.5acb2515
	I1124 09:19:25.631156 1701291 certs.go:360] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/proxy-client.key
	I1124 09:19:25.631166 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I1124 09:19:25.631180 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I1124 09:19:25.631190 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I1124 09:19:25.631200 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I1124 09:19:25.631210 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I1124 09:19:25.631221 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I1124 09:19:25.631231 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I1124 09:19:25.631241 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I1124 09:19:25.631304 1701291 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/1654467.pem (1338 bytes)
	W1124 09:19:25.631338 1701291 certs.go:480] ignoring /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/1654467_empty.pem, impossibly tiny 0 bytes
	I1124 09:19:25.631352 1701291 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca-key.pem (1671 bytes)
	I1124 09:19:25.631382 1701291 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem (1078 bytes)
	I1124 09:19:25.631410 1701291 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/cert.pem (1123 bytes)
	I1124 09:19:25.631434 1701291 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/key.pem (1679 bytes)
	I1124 09:19:25.631484 1701291 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/ssl/certs/16544672.pem (1708 bytes)
	I1124 09:19:25.631512 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I1124 09:19:25.631529 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/1654467.pem -> /usr/share/ca-certificates/1654467.pem
	I1124 09:19:25.631542 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/ssl/certs/16544672.pem -> /usr/share/ca-certificates/16544672.pem
	I1124 09:19:25.632117 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1124 09:19:25.653566 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I1124 09:19:25.672677 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1124 09:19:25.692448 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I1124 09:19:25.712758 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I1124 09:19:25.730246 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I1124 09:19:25.748136 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1124 09:19:25.765102 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I1124 09:19:25.782676 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1124 09:19:25.800418 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/1654467.pem --> /usr/share/ca-certificates/1654467.pem (1338 bytes)
	I1124 09:19:25.818179 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/ssl/certs/16544672.pem --> /usr/share/ca-certificates/16544672.pem (1708 bytes)
	I1124 09:19:25.836420 1701291 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I1124 09:19:25.849273 1701291 ssh_runner.go:195] Run: openssl version
	I1124 09:19:25.855675 1701291 command_runner.go:130] > OpenSSL 3.0.17 1 Jul 2025 (Library: OpenSSL 3.0.17 1 Jul 2025)
	I1124 09:19:25.855803 1701291 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I1124 09:19:25.864243 1701291 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1124 09:19:25.867919 1701291 command_runner.go:130] > -rw-r--r-- 1 root root 1111 Nov 24 08:44 /usr/share/ca-certificates/minikubeCA.pem
	I1124 09:19:25.867982 1701291 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Nov 24 08:44 /usr/share/ca-certificates/minikubeCA.pem
	I1124 09:19:25.868042 1701291 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1124 09:19:25.908611 1701291 command_runner.go:130] > b5213941
	I1124 09:19:25.909123 1701291 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I1124 09:19:25.916880 1701291 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1654467.pem && ln -fs /usr/share/ca-certificates/1654467.pem /etc/ssl/certs/1654467.pem"
	I1124 09:19:25.925097 1701291 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1654467.pem
	I1124 09:19:25.928711 1701291 command_runner.go:130] > -rw-r--r-- 1 root root 1338 Nov 24 09:10 /usr/share/ca-certificates/1654467.pem
	I1124 09:19:25.928823 1701291 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Nov 24 09:10 /usr/share/ca-certificates/1654467.pem
	I1124 09:19:25.928900 1701291 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1654467.pem
	I1124 09:19:25.969833 1701291 command_runner.go:130] > 51391683
	I1124 09:19:25.970298 1701291 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1654467.pem /etc/ssl/certs/51391683.0"
	I1124 09:19:25.978202 1701291 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/16544672.pem && ln -fs /usr/share/ca-certificates/16544672.pem /etc/ssl/certs/16544672.pem"
	I1124 09:19:25.986297 1701291 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/16544672.pem
	I1124 09:19:25.989958 1701291 command_runner.go:130] > -rw-r--r-- 1 root root 1708 Nov 24 09:10 /usr/share/ca-certificates/16544672.pem
	I1124 09:19:25.990028 1701291 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Nov 24 09:10 /usr/share/ca-certificates/16544672.pem
	I1124 09:19:25.990094 1701291 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/16544672.pem
	I1124 09:19:26.030947 1701291 command_runner.go:130] > 3ec20f2e
	I1124 09:19:26.031428 1701291 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/16544672.pem /etc/ssl/certs/3ec20f2e.0"
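	The hash/symlink pairs above follow the OpenSSL trust-store convention: OpenSSL looks up a CA in /etc/ssl/certs by its subject-name hash, so `<hash>.0` must be a symlink to (or copy of) the PEM — which is exactly what each `ln -fs` sets up. A sketch of the same steps with a throwaway self-signed cert in a temp directory, assuming the `openssl` CLI is available:

```shell
# Create a throwaway CA cert (demoCA is an arbitrary subject).
dir=$(mktemp -d)
openssl req -x509 -newkey rsa:2048 -nodes -subj '/CN=demoCA' \
  -keyout "$dir/ca.key" -out "$dir/ca.pem" -days 1 2>/dev/null
# Compute the subject-name hash and link <hash>.0 to the PEM,
# mirroring the test's `openssl x509 -hash` + `ln -fs` pair.
h=$(openssl x509 -hash -noout -in "$dir/ca.pem")
ln -fs "$dir/ca.pem" "$dir/$h.0"
echo "$h"   # an 8-digit hex hash, like b5213941 above
```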
	I1124 09:19:26.039972 1701291 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1124 09:19:26.043966 1701291 command_runner.go:130] >   File: /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1124 09:19:26.043995 1701291 command_runner.go:130] >   Size: 1176      	Blocks: 8          IO Block: 4096   regular file
	I1124 09:19:26.044001 1701291 command_runner.go:130] > Device: 259,1	Inode: 1320367     Links: 1
	I1124 09:19:26.044008 1701291 command_runner.go:130] > Access: (0644/-rw-r--r--)  Uid: (    0/    root)   Gid: (    0/    root)
	I1124 09:19:26.044023 1701291 command_runner.go:130] > Access: 2025-11-24 09:15:17.409446871 +0000
	I1124 09:19:26.044028 1701291 command_runner.go:130] > Modify: 2025-11-24 09:11:12.722825550 +0000
	I1124 09:19:26.044034 1701291 command_runner.go:130] > Change: 2025-11-24 09:11:12.722825550 +0000
	I1124 09:19:26.044039 1701291 command_runner.go:130] >  Birth: 2025-11-24 09:11:12.722825550 +0000
	I1124 09:19:26.044132 1701291 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I1124 09:19:26.086676 1701291 command_runner.go:130] > Certificate will not expire
	I1124 09:19:26.086876 1701291 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I1124 09:19:26.129915 1701291 command_runner.go:130] > Certificate will not expire
	I1124 09:19:26.130020 1701291 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I1124 09:19:26.173544 1701291 command_runner.go:130] > Certificate will not expire
	I1124 09:19:26.174084 1701291 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I1124 09:19:26.214370 1701291 command_runner.go:130] > Certificate will not expire
	I1124 09:19:26.214874 1701291 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I1124 09:19:26.257535 1701291 command_runner.go:130] > Certificate will not expire
	I1124 09:19:26.257999 1701291 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I1124 09:19:26.298467 1701291 command_runner.go:130] > Certificate will not expire
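	Each `-checkend 86400` run above asks OpenSSL whether the certificate will still be valid 86400 seconds (24 h) from now; it prints "Certificate will not expire" and exits 0 when it will. A sketch with a two-day throwaway cert, assuming the `openssl` CLI is available (file names are arbitrary):

```shell
# Generate a cert valid for 2 days.
dir=$(mktemp -d)
openssl req -x509 -newkey rsa:2048 -nodes -subj '/CN=t' \
  -keyout "$dir/t.key" -out "$dir/t.crt" -days 2 2>/dev/null
# Still valid 24h from now, so this prints "Certificate will not expire"
# and exits 0; an expiring cert would print "Certificate will expire"
# and exit non-zero.
openssl x509 -noout -in "$dir/t.crt" -checkend 86400
```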
	I1124 09:19:26.298937 1701291 kubeadm.go:401] StartCluster: {Name:functional-291288 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-291288 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1124 09:19:26.299045 1701291 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I1124 09:19:26.299146 1701291 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1124 09:19:26.324900 1701291 cri.go:89] found id: ""
	I1124 09:19:26.325047 1701291 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1124 09:19:26.331898 1701291 command_runner.go:130] > /var/lib/kubelet/config.yaml
	I1124 09:19:26.331976 1701291 command_runner.go:130] > /var/lib/kubelet/kubeadm-flags.env
	I1124 09:19:26.331999 1701291 command_runner.go:130] > /var/lib/minikube/etcd:
	I1124 09:19:26.332730 1701291 kubeadm.go:417] found existing configuration files, will attempt cluster restart
	I1124 09:19:26.332771 1701291 kubeadm.go:598] restartPrimaryControlPlane start ...
	I1124 09:19:26.332851 1701291 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I1124 09:19:26.340023 1701291 kubeadm.go:131] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I1124 09:19:26.340455 1701291 kubeconfig.go:47] verify endpoint returned: get endpoint: "functional-291288" does not appear in /home/jenkins/minikube-integration/21978-1652607/kubeconfig
	I1124 09:19:26.340556 1701291 kubeconfig.go:62] /home/jenkins/minikube-integration/21978-1652607/kubeconfig needs updating (will repair): [kubeconfig missing "functional-291288" cluster setting kubeconfig missing "functional-291288" context setting]
	I1124 09:19:26.340827 1701291 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21978-1652607/kubeconfig: {Name:mk02121ae6148bede61eabf0ed4e1826024715f4 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1124 09:19:26.341245 1701291 loader.go:402] Config loaded from file:  /home/jenkins/minikube-integration/21978-1652607/kubeconfig
	I1124 09:19:26.341410 1701291 kapi.go:59] client config for functional-291288: &rest.Config{Host:"https://192.168.49.2:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/client.crt", KeyFile:"/home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/client.key", CAFile:"/home/jenkins/minikube-integration/21978-1652607/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1fb2df0), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1124 09:19:26.341966 1701291 envvar.go:172] "Feature gate default state" feature="ClientsAllowCBOR" enabled=false
	I1124 09:19:26.341987 1701291 envvar.go:172] "Feature gate default state" feature="ClientsPreferCBOR" enabled=false
	I1124 09:19:26.341993 1701291 envvar.go:172] "Feature gate default state" feature="InformerResourceVersion" enabled=false
	I1124 09:19:26.341999 1701291 envvar.go:172] "Feature gate default state" feature="InOrderInformers" enabled=true
	I1124 09:19:26.342005 1701291 envvar.go:172] "Feature gate default state" feature="WatchListClient" enabled=false
	I1124 09:19:26.342302 1701291 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I1124 09:19:26.342404 1701291 cert_rotation.go:141] "Starting client certificate rotation controller" logger="tls-transport-cache"
	I1124 09:19:26.349720 1701291 kubeadm.go:635] The running cluster does not require reconfiguration: 192.168.49.2
	I1124 09:19:26.349757 1701291 kubeadm.go:602] duration metric: took 16.96677ms to restartPrimaryControlPlane
	I1124 09:19:26.349768 1701291 kubeadm.go:403] duration metric: took 50.840633ms to StartCluster
	I1124 09:19:26.349802 1701291 settings.go:142] acquiring lock: {Name:mk6c04793f5fd4f38f92abf4357247f2ccd7fc4e Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1124 09:19:26.349888 1701291 settings.go:150] Updating kubeconfig:  /home/jenkins/minikube-integration/21978-1652607/kubeconfig
	I1124 09:19:26.350548 1701291 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21978-1652607/kubeconfig: {Name:mk02121ae6148bede61eabf0ed4e1826024715f4 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1124 09:19:26.350757 1701291 start.go:236] Will wait 6m0s for node &{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I1124 09:19:26.351051 1701291 config.go:182] Loaded profile config "functional-291288": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1124 09:19:26.351103 1701291 addons.go:527] enable addons start: toEnable=map[ambassador:false amd-gpu-device-plugin:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:true efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubetail:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:true storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I1124 09:19:26.351171 1701291 addons.go:70] Setting storage-provisioner=true in profile "functional-291288"
	I1124 09:19:26.351184 1701291 addons.go:239] Setting addon storage-provisioner=true in "functional-291288"
	I1124 09:19:26.351210 1701291 host.go:66] Checking if "functional-291288" exists ...
	I1124 09:19:26.351260 1701291 addons.go:70] Setting default-storageclass=true in profile "functional-291288"
	I1124 09:19:26.351281 1701291 addons_storage_classes.go:34] enableOrDisableStorageClasses default-storageclass=true on "functional-291288"
	I1124 09:19:26.351591 1701291 cli_runner.go:164] Run: docker container inspect functional-291288 --format={{.State.Status}}
	I1124 09:19:26.351665 1701291 cli_runner.go:164] Run: docker container inspect functional-291288 --format={{.State.Status}}
	I1124 09:19:26.356026 1701291 out.go:179] * Verifying Kubernetes components...
	I1124 09:19:26.358753 1701291 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1124 09:19:26.386934 1701291 loader.go:402] Config loaded from file:  /home/jenkins/minikube-integration/21978-1652607/kubeconfig
	I1124 09:19:26.387124 1701291 kapi.go:59] client config for functional-291288: &rest.Config{Host:"https://192.168.49.2:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/client.crt", KeyFile:"/home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/client.key", CAFile:"/home/jenkins/minikube-integration/21978-1652607/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1fb2df0), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1124 09:19:26.387397 1701291 addons.go:239] Setting addon default-storageclass=true in "functional-291288"
	I1124 09:19:26.387423 1701291 host.go:66] Checking if "functional-291288" exists ...
	I1124 09:19:26.387832 1701291 cli_runner.go:164] Run: docker container inspect functional-291288 --format={{.State.Status}}
	I1124 09:19:26.389901 1701291 out.go:179]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	I1124 09:19:26.395008 1701291 addons.go:436] installing /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 09:19:26.395037 1701291 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
	I1124 09:19:26.395101 1701291 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:19:26.420232 1701291 addons.go:436] installing /etc/kubernetes/addons/storageclass.yaml
	I1124 09:19:26.420253 1701291 ssh_runner.go:362] scp storageclass/storageclass.yaml --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
	I1124 09:19:26.420313 1701291 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:19:26.425570 1701291 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34684 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-291288/id_rsa Username:docker}
	I1124 09:19:26.456516 1701291 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34684 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-291288/id_rsa Username:docker}
	I1124 09:19:26.560922 1701291 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1124 09:19:26.576856 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 09:19:26.613035 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
	I1124 09:19:27.382844 1701291 node_ready.go:35] waiting up to 6m0s for node "functional-291288" to be "Ready" ...
	I1124 09:19:27.383045 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:27.383222 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:27.383136 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:27.383333 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:27.383470 1701291 retry.go:31] will retry after 330.402351ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:27.383574 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:27.383622 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:27.383641 1701291 retry.go:31] will retry after 362.15201ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:27.383749 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:27.714181 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 09:19:27.746972 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1124 09:19:27.795758 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:27.795808 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:27.795853 1701291 retry.go:31] will retry after 486.739155ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:27.825835 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:27.825930 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:27.825968 1701291 retry.go:31] will retry after 300.110995ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:27.884058 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:27.884183 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:27.884499 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:28.126983 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1124 09:19:28.217006 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:28.217052 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:28.217072 1701291 retry.go:31] will retry after 300.765079ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:28.283248 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 09:19:28.347318 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:28.347417 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:28.347441 1701291 retry.go:31] will retry after 303.335388ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:28.383528 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:28.383642 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:28.383982 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:28.518292 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1124 09:19:28.580592 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:28.580640 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:28.580660 1701291 retry.go:31] will retry after 1.066338993s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:28.651903 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 09:19:28.713844 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:28.713897 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:28.713918 1701291 retry.go:31] will retry after 1.056665241s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:28.884118 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:28.884220 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:28.884569 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:29.383298 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:29.383424 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:29.383770 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:19:29.383848 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:19:29.647985 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1124 09:19:29.716805 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:29.720169 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:29.720200 1701291 retry.go:31] will retry after 944.131514ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:29.771443 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 09:19:29.838798 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:29.842880 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:29.842911 1701291 retry.go:31] will retry after 1.275018698s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:29.883132 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:29.883209 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:29.883509 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:30.383649 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:30.383776 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:30.384127 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:30.664505 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1124 09:19:30.720036 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:30.723467 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:30.723535 1701291 retry.go:31] will retry after 2.138623105s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:30.883817 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:30.883887 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:30.884224 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:31.118957 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 09:19:31.199799 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:31.199840 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:31.199882 1701291 retry.go:31] will retry after 2.182241097s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:31.383252 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:31.383376 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:31.383741 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:31.883141 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:31.883218 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:31.883484 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:19:31.883535 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:19:32.383203 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:32.383282 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:32.383615 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:32.863283 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1124 09:19:32.883678 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:32.883784 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:32.884128 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:32.923038 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:32.923079 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:32.923098 1701291 retry.go:31] will retry after 3.572603171s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:33.382308 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 09:19:33.383761 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:33.383826 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:33.384119 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:33.453074 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:33.453119 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:33.453141 1701291 retry.go:31] will retry after 3.109489242s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:33.883699 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:33.883773 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:33.884102 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:19:33.884157 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:19:34.383924 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:34.383999 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:34.384345 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:34.883591 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:34.883679 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:34.883980 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:35.383814 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:35.383894 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:35.384241 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:35.884036 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:35.884171 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:35.884537 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:19:35.884594 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:19:36.383696 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:36.383766 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:36.384025 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:36.496437 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1124 09:19:36.551663 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:36.555562 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:36.555638 1701291 retry.go:31] will retry after 5.073494199s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:36.562783 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 09:19:36.628271 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:36.628317 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:36.628342 1701291 retry.go:31] will retry after 5.770336946s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:36.883845 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:36.883918 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:36.884243 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:37.384077 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:37.384153 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:37.384472 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:37.883154 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:37.883226 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:37.883536 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:38.383157 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:38.383232 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:38.383563 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:19:38.383620 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:19:38.883187 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:38.883316 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:38.883616 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:39.383889 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:39.383969 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:39.384246 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:39.884093 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:39.884182 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:39.884521 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:40.383195 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:40.383272 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:40.383608 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:19:40.383663 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:19:40.883068 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:40.883144 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:40.883421 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:41.383174 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:41.383248 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:41.383670 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:41.630088 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1124 09:19:41.704671 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:41.704728 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:41.704747 1701291 retry.go:31] will retry after 8.448093656s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:41.884076 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:41.884161 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:41.884479 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:42.383803 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:42.383879 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:42.384141 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:19:42.384184 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:19:42.399541 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 09:19:42.476011 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:42.476071 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:42.476093 1701291 retry.go:31] will retry after 9.502945959s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:42.883588 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:42.883671 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:42.884026 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:43.383828 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:43.383907 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:43.384181 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:43.883670 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:43.883743 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:43.884060 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:44.383696 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:44.383771 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:44.384127 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:19:44.384222 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:19:44.883976 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:44.884053 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:44.884413 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:45.383648 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:45.383811 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:45.384197 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:45.883998 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:45.884089 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:45.884467 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:46.383603 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:46.383678 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:46.384022 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:46.883619 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:46.883699 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:46.883981 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:19:46.884038 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:19:47.383777 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:47.383855 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:47.384200 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:47.883911 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:47.884016 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:47.884384 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:48.383668 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:48.383739 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:48.384087 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:48.883874 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:48.883952 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:48.884283 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:19:48.884343 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:19:49.383082 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:49.383173 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:49.383540 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:49.883082 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:49.883151 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:49.883411 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:50.153986 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1124 09:19:50.216789 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:50.216837 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:50.216857 1701291 retry.go:31] will retry after 12.027560843s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:50.383117 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:50.383226 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:50.383583 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:50.883287 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:50.883368 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:50.883726 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:51.383622 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:51.383710 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:51.384038 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:19:51.384100 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:19:51.883690 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:51.883770 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:51.884105 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:51.979351 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 09:19:52.048232 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:52.048287 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:52.048307 1701291 retry.go:31] will retry after 5.922680138s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:52.383846 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:52.383926 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:52.384262 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:52.883642 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:52.883714 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:52.884029 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:53.383844 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:53.383917 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:53.384249 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:19:53.384309 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:19:53.884029 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:53.884108 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:53.884493 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:54.383680 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:54.383755 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:54.384008 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:54.883852 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:54.883926 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:54.884262 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:55.384060 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:55.384132 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:55.384467 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:19:55.384528 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:19:55.883800 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:55.883874 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:55.884153 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:56.383266 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:56.383344 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:56.383682 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:56.883176 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:56.883258 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:56.883607 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:57.383853 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:57.383936 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:57.384284 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:57.884078 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:57.884157 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:57.884542 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:19:57.884608 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:19:57.972042 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 09:19:58.032393 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:58.036131 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:58.036169 1701291 retry.go:31] will retry after 15.323516146s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:58.383700 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:58.383776 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:58.384074 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:58.883637 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:58.883711 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:58.883992 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:59.383767 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:59.383847 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:59.384170 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:59.883954 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:59.884029 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:59.884364 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:00.386702 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:00.386929 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:00.387350 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:00.387652 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:00.883998 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:00.884089 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:00.884461 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:01.383250 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:01.383328 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:01.383704 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:01.883996 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:01.884068 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:01.884357 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:02.244687 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1124 09:20:02.303604 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:20:02.306952 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:20:02.306992 1701291 retry.go:31] will retry after 20.630907774s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:20:02.383202 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:02.383281 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:02.383599 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:02.883330 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:02.883410 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:02.883745 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:02.883800 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:03.383311 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:03.383386 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:03.383651 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:03.883196 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:03.883275 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:03.883594 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:04.383175 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:04.383259 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:04.383568 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:04.883120 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:04.883192 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:04.883478 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:05.383217 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:05.383295 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:05.383624 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:05.383680 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:05.883368 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:05.883446 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:05.883773 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:06.383723 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:06.383806 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:06.384068 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:06.883869 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:06.883945 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:06.884264 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:07.384063 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:07.384138 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:07.384462 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:07.384526 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:07.883109 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:07.883188 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:07.883446 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:08.383152 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:08.383224 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:08.383566 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:08.883199 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:08.883279 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:08.883603 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:09.383126 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:09.383212 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:09.383470 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:09.883162 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:09.883254 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:09.883549 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:09.883599 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:10.383190 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:10.383264 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:10.383586 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:10.883817 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:10.883892 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:10.884145 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:11.383273 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:11.383344 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:11.383622 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:11.883313 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:11.883389 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:11.883749 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:11.883806 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:12.383448 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:12.383520 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:12.383791 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:12.883177 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:12.883257 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:12.883572 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:13.360275 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 09:20:13.383805 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:13.383886 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:13.384154 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:13.423794 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:20:13.423847 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:20:13.423866 1701291 retry.go:31] will retry after 19.725114159s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:20:13.884034 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:13.884124 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:13.884430 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:13.884481 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:14.383158 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:14.383258 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:14.383624 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:14.883202 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:14.883284 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:14.883644 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:15.383356 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:15.383435 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:15.383734 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:15.883472 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:15.883549 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:15.883909 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:16.384044 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:16.384118 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:16.384464 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:16.384554 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:16.883205 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:16.883292 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:16.883609 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:17.383212 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:17.383289 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:17.383587 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:17.883207 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:17.883289 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:17.883594 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:18.383758 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:18.383840 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:18.384110 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:18.883984 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:18.884085 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:18.884539 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:18.884620 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:19.383195 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:19.383308 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:19.383679 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:19.883124 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:19.883204 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:19.883474 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:20.383183 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:20.383264 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:20.383612 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:20.883327 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:20.883410 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:20.883750 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:21.383113 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:21.383189 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:21.383447 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:21.383491 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:21.883186 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:21.883258 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:21.883619 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:22.383192 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:22.383277 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:22.383650 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:22.883350 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:22.883422 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:22.883692 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:22.939045 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1124 09:20:23.002892 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:20:23.002941 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:20:23.002963 1701291 retry.go:31] will retry after 24.365576381s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:20:23.384046 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:23.384125 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:23.384460 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:23.384522 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:23.883216 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:23.883293 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:23.883634 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:24.383833 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:24.383929 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:24.384212 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:24.884088 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:24.884168 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:24.884519 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:25.383227 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:25.383307 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:25.383654 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:25.883912 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:25.883982 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:25.884337 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:25.884396 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:26.383528 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:26.383619 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:26.383952 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:26.883735 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:26.883810 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:26.884149 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:27.383645 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:27.383725 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:27.384079 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:27.883693 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:27.883792 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:27.884080 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:28.383869 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:28.383941 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:28.384276 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:28.384333 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:28.883621 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:28.883696 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:28.884021 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:29.383693 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:29.383768 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:29.384125 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:29.883838 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:29.883920 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:29.884279 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:30.383628 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:30.383705 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:30.383961 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:30.883414 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:30.883492 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:30.883837 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:30.883893 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:31.383689 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:31.383767 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:31.384087 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:31.883629 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:31.883699 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:31.883964 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:32.383819 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:32.383897 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:32.384254 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:32.884067 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:32.884145 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:32.884453 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:32.884504 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:33.149949 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 09:20:33.204697 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:20:33.208037 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:20:33.208070 1701291 retry.go:31] will retry after 22.392696015s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:20:33.383469 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:33.383538 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:33.383796 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:33.883550 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:33.883634 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:33.883947 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:34.383737 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:34.383811 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:34.384171 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:34.883654 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:34.883734 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:34.884066 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:35.383856 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:35.383928 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:35.384271 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:35.384326 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:35.883926 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:35.884005 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:35.884370 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:36.383314 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:36.383384 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:36.383644 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:36.883149 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:36.883225 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:36.883565 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:37.383275 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:37.383359 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:37.383702 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:37.883387 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:37.883466 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:37.883722 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:37.883762 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:38.383178 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:38.383252 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:38.383603 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:38.883170 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:38.883244 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:38.883650 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:39.383205 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:39.383274 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:39.383534 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:39.883192 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:39.883267 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:39.883632 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:40.383376 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:40.383463 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:40.383839 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:40.383896 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:40.883093 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:40.883170 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:40.883479 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:41.383194 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:41.383276 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:41.383635 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:41.883337 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:41.883422 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:41.883716 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:42.383386 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:42.383461 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:42.383814 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:42.883190 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:42.883266 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:42.883601 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:42.883670 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:43.383217 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:43.383293 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:43.383671 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:43.883117 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:43.883198 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:43.883473 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:44.383174 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:44.383255 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:44.383558 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:44.883289 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:44.883363 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:44.883641 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:45.383065 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:45.383134 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:45.383415 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:45.383456 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:45.883191 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:45.883274 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:45.883563 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:46.383411 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:46.383487 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:46.383849 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:46.883402 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:46.883490 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:46.883752 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:47.369539 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1124 09:20:47.383080 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:47.383149 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:47.383440 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:47.383498 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:47.426348 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:20:47.429646 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:20:47.429686 1701291 retry.go:31] will retry after 22.399494886s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:20:47.883262 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:47.883365 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:47.883699 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:48.383121 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:48.383192 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:48.383450 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:48.883172 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:48.883246 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:48.883565 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:49.383175 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:49.383254 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:49.383619 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:49.383673 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:49.883307 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:49.883381 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:49.883648 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:50.383167 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:50.383242 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:50.383602 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:50.883305 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:50.883403 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:50.883701 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:51.383597 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:51.383671 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:51.383949 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:51.383999 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:51.883799 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:51.883891 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:51.884215 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:52.383953 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:52.384046 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:52.384337 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:52.883622 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:52.883695 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:52.883974 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:53.383750 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:53.383840 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:53.384189 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:53.384246 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:53.883872 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:53.883946 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:53.884278 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:54.383691 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:54.383768 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:54.384062 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:54.883870 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:54.883951 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:54.884279 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:55.384078 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:55.384159 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:55.384531 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:55.384594 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:55.601942 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 09:20:55.661064 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:20:55.665031 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:20:55.665156 1701291 out.go:285] ! Enabling 'storage-provisioner' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	I1124 09:20:55.883471 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:55.883549 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:55.883839 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:56.384006 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:56.384085 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:56.384438 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:56.883159 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:56.883256 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:56.883546 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:57.383058 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:57.383127 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:57.383401 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:57.883132 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:57.883210 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:57.883522 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:57.883572 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:58.383147 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:58.383243 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:58.383538 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:58.883654 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:58.883729 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:58.883987 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:59.383805 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:59.383888 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:59.384179 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:59.883985 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:59.884058 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:59.884355 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:59.884403 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:00.383754 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:00.383833 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:00.384151 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:00.883935 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:00.884016 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:00.884352 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:01.383267 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:01.383344 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:01.383652 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:01.883147 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:01.883222 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:01.883556 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:02.383255 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:02.383332 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:02.383663 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:02.383721 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:02.883448 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:02.883530 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:02.883895 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:03.383623 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:03.383692 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:03.383959 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:03.883727 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:03.883833 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:03.884183 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:04.383989 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:04.384068 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:04.384431 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:04.384491 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:04.883658 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:04.883737 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:04.884051 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:05.383792 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:05.383863 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:05.384221 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:05.883873 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:05.883951 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:05.884288 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:06.383270 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:06.383343 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:06.383618 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:06.883169 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:06.883268 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:06.883573 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:06.883620 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:07.383342 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:07.383427 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:07.383765 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:07.884027 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:07.884094 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:07.884425 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:08.383164 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:08.383239 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:08.383598 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:08.883308 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:08.883398 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:08.883741 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:08.883802 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:09.383098 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:09.383166 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:09.383423 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:09.830147 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1124 09:21:09.883707 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:09.883815 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:09.884234 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:09.887265 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:21:09.890761 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:21:09.890861 1701291 out.go:285] ! Enabling 'default-storageclass' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	I1124 09:21:09.895836 1701291 out.go:179] * Enabled addons: 
	I1124 09:21:09.899594 1701291 addons.go:530] duration metric: took 1m43.548488453s for enable addons: enabled=[]
	I1124 09:21:10.383381 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:10.383468 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:10.383851 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:10.883541 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:10.883612 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:10.883871 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:10.883921 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:11.383721 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:11.383804 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:11.384146 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:11.883758 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:11.883832 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:11.884153 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:12.383650 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:12.383725 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:12.383994 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:12.883791 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:12.883869 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:12.884200 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:12.884259 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:13.384051 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:13.384130 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:13.384481 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:13.883069 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:13.883147 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:13.883443 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:14.383158 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:14.383256 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:14.383600 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:14.883308 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:14.883386 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:14.883743 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:15.383457 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:15.383524 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:15.383790 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:15.383833 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:15.883160 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:15.883235 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:15.883570 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:16.383347 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:16.383428 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:16.383759 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:16.883325 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:16.883399 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:16.883664 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:17.383191 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:17.383290 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:17.383661 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:17.883228 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:17.883306 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:17.883672 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:17.883730 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:18.383978 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:18.384061 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:18.384373 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:18.883112 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:18.883209 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:18.883544 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:19.383112 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:19.383198 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:19.383544 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:19.883660 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:19.883735 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:19.883994 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:19.884034 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:20.383840 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:20.383939 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:20.384276 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:20.884063 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:20.884139 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:20.884609 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:21.383122 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:21.383191 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:21.383474 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:21.883229 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:21.883311 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:21.883643 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:22.383182 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:22.383259 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:22.383610 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:22.383663 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:22.884007 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:22.884077 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:22.884343 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:23.384148 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:23.384238 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:23.384581 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:23.883132 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:23.883207 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:23.883554 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:24.383088 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:24.383159 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:24.383481 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:24.883179 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:24.883275 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:24.883610 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:24.883675 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:25.383186 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:25.383268 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:25.383608 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:25.883472 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:25.883586 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:25.884129 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:26.383146 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:26.383230 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:26.383577 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:26.883188 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:26.883299 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:26.883678 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:26.883758 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:27.384111 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:27.384181 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:27.384491 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:27.883092 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:27.883171 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:27.883515 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:28.383158 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:28.383240 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:28.383623 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:28.883309 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:28.883385 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:28.883717 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:29.383191 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:29.383266 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:29.383650 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:29.383702 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:29.883188 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:29.883275 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:29.883613 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:30.383126 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:30.383193 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:30.383490 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:30.883171 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:30.883254 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:30.883605 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:31.383171 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:31.383250 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:31.383601 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:31.883122 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:31.883191 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:31.883449 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:31.883489 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:32.383217 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:32.383291 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:32.383620 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:32.883183 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:32.883260 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:32.883629 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:33.383180 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:33.383262 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:33.383549 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:33.883190 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:33.883271 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:33.883638 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:33.883694 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:34.383256 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:34.383337 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:34.383680 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:34.883132 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:34.883214 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:34.883526 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:35.383172 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:35.383252 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:35.383615 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:35.883199 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:35.883282 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:35.883582 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:36.383281 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:36.383351 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:36.383609 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:36.383650 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:36.883304 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:36.883386 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:36.883706 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:37.383434 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:37.383512 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:37.383858 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:37.883555 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:37.883635 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:37.883920 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:38.383713 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:38.383800 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:38.384150 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:38.384211 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:38.884005 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:38.884085 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:38.884432 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:39.383117 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:39.383189 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:39.383470 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:39.883210 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:39.883320 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:39.883681 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:40.383192 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:40.383273 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:40.383648 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:40.883329 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:40.883413 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:40.883677 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:40.883719 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:41.383810 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:41.383891 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:41.384260 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:41.884110 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:41.884211 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:41.884610 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:42.383111 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:42.383184 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:42.383469 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:42.883146 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:42.883219 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:42.883556 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:43.383303 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:43.383390 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:43.383815 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:43.383880 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:43.884150 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:43.884225 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:43.884489 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:44.383187 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:44.383285 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:44.383631 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:44.883347 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:44.883424 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:44.883787 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:45.383143 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:45.383221 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:45.383485 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:45.883220 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:45.883291 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:45.883631 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:45.883683 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:46.383565 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:46.383643 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:46.384005 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:46.883681 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:46.883753 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:46.884095 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:47.383931 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:47.384032 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:47.384438 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:47.884098 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:47.884173 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:47.884475 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:47.884521 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:48.383141 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:48.383214 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:48.383504 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:48.883189 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:48.883295 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:48.883641 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:49.383237 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:49.383316 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:49.383651 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:49.883138 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:49.883214 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:49.883514 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:50.383163 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:50.383242 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:50.383592 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:50.383651 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:50.883194 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:50.883284 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:50.883599 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:51.383074 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:51.383155 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:51.383436 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:51.883141 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:51.883231 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:51.883582 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:52.383155 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:52.383242 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:52.383575 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:52.883252 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:52.883327 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:52.883595 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:52.883642 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:53.383321 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:53.383392 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:53.383737 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:53.883216 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:53.883293 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:53.883646 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:54.383339 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:54.383413 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:54.383688 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:54.883201 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:54.883274 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:54.883590 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:55.383288 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:55.383366 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:55.383704 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:55.383769 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:55.883435 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:55.883505 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:55.883816 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:56.384009 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:56.384088 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:56.384422 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:56.883137 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:56.883222 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:56.883558 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:57.383818 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:57.383897 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:57.384172 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:57.384212 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:57.883977 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:57.884053 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:57.884399 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:58.383153 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:58.383233 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:58.383556 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:58.883101 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:58.883177 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:58.883433 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:59.383157 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:59.383236 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:59.383565 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:59.883168 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:59.883258 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:59.883650 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:59.883705 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:00.383305 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:00.383386 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:00.383837 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:00.883166 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:00.883245 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:00.883577 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:01.383186 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:01.383267 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:01.383606 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:01.883853 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:01.883923 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:01.884206 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:01.884257 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:02.384016 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:02.384095 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:02.384455 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:02.884100 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:02.884181 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:02.884522 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:03.383131 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:03.383207 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:03.383521 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:03.883200 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:03.883287 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:03.883604 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:04.383190 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:04.383274 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:04.383643 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:04.383702 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:04.883207 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:04.883279 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:04.883551 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:05.383188 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:05.383272 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:05.383607 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:05.883177 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:05.883257 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:05.883627 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:06.383792 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:06.383879 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:06.384240 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:06.384291 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:06.884040 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:06.884120 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:06.884445 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:07.383154 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:07.383230 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:07.383566 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:07.883872 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:07.883944 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:07.884212 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:08.383967 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:08.384042 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:08.384363 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:08.384428 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:08.883105 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:08.883184 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:08.883520 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:09.383654 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:09.383727 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:09.384039 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:09.883710 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:09.883788 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:09.884141 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:10.383940 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:10.384022 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:10.384358 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:10.883643 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:10.883717 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:10.883979 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:10.884026 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:11.384043 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:11.384119 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:11.384476 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:11.884107 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:11.884182 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:11.884497 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:12.383080 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:12.383156 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:12.383420 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:12.883114 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:12.883197 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:12.883546 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:13.383148 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:13.383235 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:13.383567 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:13.383626 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:13.883128 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:13.883206 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:13.883519 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:14.383161 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:14.383237 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:14.383589 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:14.883306 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:14.883385 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:14.883735 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:15.384005 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:15.384082 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:15.384357 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:15.384407 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:15.883074 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:15.883147 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:15.883531 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:16.383351 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:16.383433 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:16.383810 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:16.883361 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:16.883437 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:16.883741 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:17.383154 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:17.383240 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:17.383580 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:17.883164 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:17.883241 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:17.883543 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:17.883590 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:18.383110 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:18.383195 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:18.383512 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:18.883210 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:18.883284 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:18.883632 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:19.383181 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:19.383265 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:19.383589 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:19.883083 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:19.883153 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:19.883418 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:20.383720 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:20.383806 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:20.384138 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:20.384189 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:20.883895 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:20.883977 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:20.884383 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:21.383110 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:21.383179 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:21.383449 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:21.883148 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:21.883222 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:21.883554 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:22.383181 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:22.383267 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:22.383745 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:22.883438 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:22.883512 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:22.883827 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:22.883878 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:23.383219 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:23.383300 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:23.383650 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:23.883352 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:23.883439 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:23.883739 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:24.383094 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:24.383172 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:24.383441 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:24.883170 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:24.883246 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:24.883573 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:25.383180 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:25.383258 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:25.383557 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:25.383602 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:25.883124 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:25.883200 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:25.883530 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:26.383418 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:26.383502 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:26.383820 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:26.883156 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:26.883232 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:26.883574 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:27.383253 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:27.383325 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:27.383640 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:27.383695 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:27.883229 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:27.883308 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:27.883663 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:28.383352 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:28.383428 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:28.383771 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:28.883152 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:28.883224 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:28.883533 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:29.383259 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:29.383346 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:29.383718 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:29.383781 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:29.883468 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:29.883551 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:29.883860 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:30.383098 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:30.383174 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:30.383431 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:30.883167 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:30.883258 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:30.883626 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:31.383185 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:31.383269 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:31.383600 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:31.883135 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:31.883200 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:31.883477 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:31.883524 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:32.383251 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:32.383334 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:32.383667 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:32.883186 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:32.883294 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:32.883590 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:33.383124 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:33.383196 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:33.383537 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:33.883238 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:33.883319 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:33.883668 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:33.883725 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:34.383411 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:34.383500 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:34.383842 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:34.883113 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:34.883201 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:34.883459 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:35.383178 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:35.383251 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:35.383573 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:35.883163 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:35.883245 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:35.883570 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:36.383743 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:36.383821 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:36.384077 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:36.384116 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:36.883871 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:36.883954 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:36.884285 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:37.384043 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:37.384116 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:37.384446 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:37.883126 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:37.883195 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:37.883464 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:38.383151 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:38.383233 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:38.383571 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:38.883272 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:38.883352 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:38.883655 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:38.883702 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:39.383349 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:39.383416 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:39.383686 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:39.883194 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:39.883268 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:39.883616 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:40.383355 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:40.383439 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:40.383825 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:40.883050 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:40.883119 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:40.883381 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:41.383178 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:41.383263 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:41.383602 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:41.383659 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:41.883334 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:41.883418 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:41.883737 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:42.383098 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:42.383164 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:42.383505 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:42.883181 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:42.883256 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:42.883607 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:43.383328 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:43.383407 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:43.383779 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:43.383848 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:43.883040 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:43.883108 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:43.883373 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:44.383062 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:44.383137 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:44.383488 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:44.883210 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:44.883294 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:44.883624 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:45.383177 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:45.383283 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:45.383549 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:45.883285 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:45.883371 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:45.883679 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:45.883730 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:46.383910 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:46.383988 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:46.384338 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:46.883634 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:46.883708 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:46.883972 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:47.383794 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:47.383890 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:47.384333 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:47.883084 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:47.883172 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:47.883512 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:48.383207 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:48.383278 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:48.383553 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:48.383599 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:48.883146 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:48.883219 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:48.883545 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:49.383190 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:49.383267 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:49.383618 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:49.883863 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:49.883935 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:49.884201 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:50.384017 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:50.384095 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:50.384461 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:50.384517 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:50.883189 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:50.883269 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:50.883636 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:51.383067 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:51.383134 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:51.383393 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:51.883095 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:51.883170 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:51.883486 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:52.383088 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:52.383168 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:52.383503 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:52.883649 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:52.883715 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:52.883972 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:52.884013 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:53.383510 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:53.383586 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:53.383942 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:53.883728 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:53.883810 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:53.884186 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:54.383720 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:54.383800 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:54.384075 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:54.883881 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:54.883959 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:54.884315 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:54.884374 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:55.383090 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:55.383174 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:55.383511 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:55.883189 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:55.883267 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:55.883536 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:56.383980 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:56.384072 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:56.384430 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:56.883181 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:56.883264 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:56.883592 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:57.383208 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:57.383283 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:57.383562 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:57.383632 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:57.883178 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:57.883262 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:57.883557 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:58.383270 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:58.383366 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:58.383681 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:58.883065 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:58.883142 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:58.883409 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:59.383139 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:59.383281 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:59.383599 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:59.883295 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:59.883368 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:59.883709 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:59.883767 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:00.383455 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:00.383535 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:00.383834 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:00.883728 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:00.883804 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:00.884143 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:01.383926 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:01.384011 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:01.384371 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:01.883658 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:01.883732 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:01.884049 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:01.884099 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:02.383854 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:02.383936 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:02.384276 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:02.883965 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:02.884044 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:02.884421 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:03.383112 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:03.383191 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:03.383460 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:03.883214 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:03.883310 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:03.883701 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:04.383439 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:04.383520 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:04.383845 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:04.383903 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:04.883132 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:04.883203 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:04.883548 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:05.383170 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:05.383248 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:05.383591 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:05.883308 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:05.883386 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:05.883685 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:06.383882 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:06.383959 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:06.384277 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:06.384350 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:06.884102 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:06.884178 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:06.884513 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:07.383085 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:07.383184 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:07.383543 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:07.883883 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:07.883956 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:07.884221 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:08.384050 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:08.384123 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:08.384452 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:08.384509 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:08.883183 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:08.883259 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:08.883584 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:09.383246 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:09.383318 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:09.383640 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:09.883219 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:09.883299 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:09.883648 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:10.383378 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:10.383458 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:10.383753 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:10.883317 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:10.883388 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:10.883680 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:10.883723 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:11.383702 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:11.383803 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:11.384131 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:11.883721 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:11.883799 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:11.884129 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:12.383663 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:12.383738 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:12.384067 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:12.883854 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:12.883940 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:12.884274 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:12.884334 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:13.384116 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:13.384195 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:13.384538 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:13.883793 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:13.883869 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:13.884135 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:14.383911 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:14.383994 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:14.384297 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:14.883958 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:14.884048 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:14.884401 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:14.884456 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:15.383622 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:15.383700 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:15.383974 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:15.883699 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:15.883778 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:15.884117 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:16.384146 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:16.384226 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:16.384578 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:16.883198 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:16.883271 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:16.883565 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:17.383190 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:17.383273 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:17.383627 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:17.383682 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:17.883355 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:17.883436 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:17.883756 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:18.383116 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:18.383185 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:18.383441 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:18.883189 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:18.883268 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:18.883596 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:19.383187 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:19.383269 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:19.383630 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:19.883313 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:19.883385 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:19.883674 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:19.883725 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:20.383174 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:20.383257 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:20.383614 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:20.883340 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:20.883415 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:20.883771 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:21.383646 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:21.383720 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:21.383985 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:21.883845 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:21.883928 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:21.884313 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:21.884368 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:22.383054 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:22.383138 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:22.383471 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:22.883086 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:22.883170 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:22.883470 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:23.383158 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:23.383233 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:23.383562 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:23.883247 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:23.883321 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:23.883637 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:24.383095 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:24.383165 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:24.383431 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:24.383471 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:24.883124 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:24.883205 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:24.883534 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:25.383173 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:25.383251 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:25.383575 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:25.883126 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:25.883196 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:25.883470 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:26.383072 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:26.383157 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:26.383507 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:26.383568 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:26.883510 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:26.883587 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:26.883957 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:27.383649 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:27.383724 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:27.384102 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:27.883942 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:27.884029 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:27.884418 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:28.383174 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:28.383254 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:28.383611 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:28.383665 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:28.883114 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:28.883191 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:28.883456 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:29.383167 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:29.383248 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:29.383597 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:29.883182 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:29.883258 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:29.883579 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:30.383122 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:30.383199 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:30.383527 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:30.883137 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:30.883215 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:30.883514 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:30.883561 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:31.383185 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:31.383260 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:31.383609 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:31.883900 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:31.883970 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:31.884278 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:32.384086 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:32.384160 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:32.384455 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:32.883182 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:32.883259 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:32.883600 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:32.883657 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:33.383111 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:33.383190 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:33.383455 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:33.883174 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:33.883260 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:33.883641 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:34.383362 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:34.383442 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:34.383802 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:34.883106 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:34.883183 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:34.883439 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:35.383143 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:35.383220 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:35.383551 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:35.383604 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:35.883151 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:35.883229 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:35.883562 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:36.383293 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:36.383366 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:36.383619 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:36.883173 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:36.883255 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:36.883580 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:37.383161 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:37.383237 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:37.383584 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:37.383635 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:37.883107 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:37.883182 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:37.883504 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:38.383163 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:38.383248 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:38.383593 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:38.883178 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:38.883257 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:38.883615 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:39.383868 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:39.383940 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:39.384210 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:39.384251 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:39.883999 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:39.884075 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:39.884422 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:40.383152 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:40.383231 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:40.383560 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:40.883278 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:40.883355 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:40.883619 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:41.383462 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:41.383549 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:41.383883 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:41.883473 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:41.883550 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:41.883893 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:41.883952 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:42.383654 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:42.383728 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:42.384013 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:42.883799 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:42.883875 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:42.884236 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:43.384072 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:43.384157 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:43.384486 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:43.883149 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:43.883222 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:43.883524 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:44.383233 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:44.383315 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:44.383652 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:44.383713 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:44.883155 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:44.883243 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:44.883579 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:45.383126 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:45.383203 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:45.383524 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:45.883188 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:45.883264 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:45.883628 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:46.383346 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:46.383429 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:46.383765 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:46.383819 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:46.883878 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:46.883951 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:46.884224 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:47.384060 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:47.384136 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:47.384469 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:47.883170 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:47.883249 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:47.883589 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:48.383128 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:48.383211 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:48.383474 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:48.883184 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:48.883264 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:48.883602 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:48.883663 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:49.383164 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:49.383241 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:49.383569 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:49.883290 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:49.883367 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:49.883671 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:50.383185 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:50.383268 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:50.383648 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:50.883431 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:50.883514 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:50.883850 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:50.883909 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:51.383652 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:51.383720 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:51.383978 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:51.883444 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:51.883523 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:51.883866 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:52.383586 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:52.383680 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:52.384026 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:52.883655 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:52.883728 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:52.884053 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:52.884105 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:53.383855 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:53.383945 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:53.384271 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:53.884101 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:53.884186 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:53.884529 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:54.383101 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:54.383176 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:54.383443 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:54.883148 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:54.883222 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:54.883575 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:55.383191 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:55.383270 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:55.383608 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:55.383664 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:55.883870 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:55.883946 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:55.884289 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:56.383256 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:56.383351 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:56.383747 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:56.883462 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:56.883538 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:56.883871 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:57.383563 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:57.383638 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:57.383899 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:57.383944 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:57.883683 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:57.883768 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:57.884147 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:58.383932 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:58.384008 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:58.384395 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:58.883091 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:58.883159 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:58.883412 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:59.383084 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:59.383166 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:59.383498 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:59.883178 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:59.883259 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:59.883595 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:59.883655 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:00.392124 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:00.392210 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:00.392556 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:00.883180 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:00.883282 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:00.883653 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:01.383190 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:01.383267 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:01.383567 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:01.883245 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:01.883313 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:01.883583 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:02.383189 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:02.383267 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:02.383605 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:02.383667 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:02.883233 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:02.883317 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:02.883620 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:03.383856 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:03.383927 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:03.384185 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:03.884056 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:03.884135 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:03.884494 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:04.383223 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:04.383311 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:04.383613 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:04.883276 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:04.883345 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:04.883599 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:04.883643 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:05.383163 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:05.383239 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:05.383541 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:05.883213 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:05.883295 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:05.883634 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:06.383304 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:06.383375 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:06.383679 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:06.883401 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:06.883483 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:06.883806 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:06.883865 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:07.383190 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:07.383271 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:07.383648 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:07.883325 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:07.883398 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:07.883710 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:08.383188 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:08.383266 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:08.383566 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:08.883267 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:08.883345 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:08.883690 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:09.384042 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:09.384118 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:09.384458 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:09.384510 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:09.883180 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:09.883253 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:09.883573 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:10.383180 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:10.383261 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:10.383583 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:10.883127 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:10.883204 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:10.883474 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:11.383167 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:11.383240 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:11.383552 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:11.883157 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:11.883234 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:11.883563 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:11.883618 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:12.383084 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:12.383153 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:12.383411 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:12.883181 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:12.883256 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:12.883591 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:13.383188 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:13.383265 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:13.383586 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:13.883123 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:13.883204 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:13.883485 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:14.383559 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:14.383638 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:14.383953 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:14.384012 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:14.883792 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:14.883868 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:14.884213 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:15.383592 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:15.383667 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:15.383925 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:15.883767 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:15.883843 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:15.884202 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:16.383348 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:16.383420 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:16.383758 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:16.883457 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:16.883538 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:16.883795 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:16.883837 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:17.383161 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:17.383238 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:17.383573 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:17.883186 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:17.883271 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:17.883611 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:18.383302 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:18.383377 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:18.383637 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:18.883191 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:18.883269 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:18.883626 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:19.383147 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:19.383224 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:19.383554 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:19.383616 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:19.883116 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:19.883185 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:19.883449 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:20.383135 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:20.383213 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:20.383531 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:20.883180 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:20.883260 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:20.883559 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:21.383124 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:21.383200 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:21.383460 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:21.883132 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:21.883213 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:21.883553 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:21.883608 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:22.383153 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:22.383241 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:22.383543 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:22.883110 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:22.883179 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:22.883439 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:23.383165 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:23.383246 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:23.383640 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:23.883372 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:23.883448 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:23.883789 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:23.883846 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:24.383112 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:24.383188 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:24.383507 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:24.883200 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:24.883275 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:24.883561 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:25.383261 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:25.383336 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:25.383674 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:25.883358 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:25.883437 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:25.883749 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:26.383290 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:26.383368 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:26.383724 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:26.383783 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:26.883478 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:26.883555 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:26.883888 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:27.383604 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:27.383677 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:27.383939 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:27.883757 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:27.883845 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:27.884167 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:28.383852 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:28.383928 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:28.384269 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:28.384325 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:28.883626 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:28.883692 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:28.883958 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:29.383717 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:29.383796 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:29.384139 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:29.883960 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:29.884036 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:29.884369 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:30.383625 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:30.383694 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:30.383980 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:30.883744 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:30.883816 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:30.884150 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:30.884205 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:31.383977 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:31.384060 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:31.384393 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:31.883624 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:31.883716 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:31.883977 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:32.383741 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:32.383814 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:32.384155 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:32.883968 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:32.884055 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:32.884386 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:32.884443 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:33.383735 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:33.383805 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:33.384072 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:33.883915 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:33.883991 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:33.884369 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:34.383120 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:34.383204 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:34.383566 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:34.883846 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:34.883924 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:34.884224 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:35.383982 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:35.384056 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:35.384427 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:35.384483 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:35.884112 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:35.884192 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:35.884530 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:36.383425 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:36.383499 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:36.383766 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:36.883482 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:36.883565 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:36.883947 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:37.383742 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:37.383819 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:37.384158 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:37.883707 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:37.883774 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:37.884034 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:37.884074 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:38.383850 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:38.383958 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:38.384324 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:38.883075 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:38.883152 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:38.883501 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:39.383112 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:39.383186 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:39.383448 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:39.883223 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:39.883319 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:39.883638 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:40.383373 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:40.383445 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:40.383734 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:40.383787 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:40.883050 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:40.883126 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:40.883428 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:41.383217 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:41.383293 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:41.383634 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:41.883211 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:41.883294 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:41.883578 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:42.383069 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:42.383136 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:42.383390 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:42.883177 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:42.883268 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:42.883564 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:42.883610 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:43.383316 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:43.383402 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:43.383752 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:43.884076 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:43.884150 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:43.884466 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:44.383187 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:44.383282 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:44.383645 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:44.883389 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:44.883464 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:44.883804 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:44.883864 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:45.383117 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:45.383195 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:45.383502 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:45.883172 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:45.883255 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:45.883601 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:46.383362 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:46.383444 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:46.383798 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:46.883132 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:46.883204 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:46.883525 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:47.383245 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:47.383343 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:47.383724 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:47.383787 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:47.883322 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:47.883396 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:47.883705 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:48.383414 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:48.383490 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:48.383778 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:48.883192 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:48.883270 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:48.883613 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:49.383457 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:49.383533 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:49.383864 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:49.383922 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:49.883061 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:49.883134 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:49.883396 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:50.383133 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:50.383215 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:50.383592 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:50.883345 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:50.883424 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:50.883767 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:51.383609 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:51.383687 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:51.383946 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:51.383994 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:51.883714 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:51.883789 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:51.884128 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:52.383943 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:52.384028 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:52.384399 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:52.883710 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:52.883786 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:52.884049 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:53.383826 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:53.383902 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:53.384299 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:53.384353 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:53.883075 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:53.883154 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:53.883549 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:54.383241 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:54.383316 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:54.383579 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:54.883201 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:54.883281 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:54.883627 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:55.383208 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:55.383284 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:55.383613 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:55.883300 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:55.883372 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:55.883626 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:55.883666 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:56.383898 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:56.383987 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:56.384342 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:56.883076 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:56.883152 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:56.883529 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:57.383840 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:57.383919 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:57.384396 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:57.883127 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:57.883220 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:57.883601 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:58.383185 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:58.383263 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:58.383601 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:58.383658 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:58.883103 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:58.883174 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:58.883430 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:59.383161 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:59.383239 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:59.383528 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:59.883235 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:59.883319 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:59.883655 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:00.383085 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:00.383175 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:00.383480 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:00.883287 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:00.883398 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:00.883768 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:25:00.883828 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:25:01.383727 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:01.383809 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:01.384138 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:01.883728 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:01.883797 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:01.884120 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:02.383919 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:02.383991 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:02.384291 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:02.884049 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:02.884120 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:02.884420 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:25:02.884485 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:25:03.383819 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:03.383888 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:03.384209 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:03.884004 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:03.884091 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:03.884451 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:04.384095 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:04.384179 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:04.384501 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:04.883234 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:04.883303 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:04.883584 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:05.383144 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:05.383220 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:05.383542 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:25:05.383601 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:25:05.883200 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:05.883285 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:05.883658 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:06.383258 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:06.383334 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:06.383660 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:06.883258 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:06.883336 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:06.883680 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:07.383404 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:07.383485 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:07.383858 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:25:07.383913 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:25:07.883619 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:07.883699 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:07.883964 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:08.383741 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:08.383818 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:08.384168 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:08.883989 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:08.884068 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:08.884393 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:09.384092 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:09.384163 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:09.384427 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:25:09.384467 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:25:09.883175 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:09.883251 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:09.883564 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:10.383182 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:10.383261 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:10.383593 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:10.883126 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:10.883201 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:10.883461 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:11.383179 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:11.383273 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:11.383596 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:11.883182 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:11.883257 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:11.883609 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:25:11.883665 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:25:12.383318 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:12.383399 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:12.383715 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:12.883177 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:12.883251 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:12.883592 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:13.383299 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:13.383377 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:13.383726 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:13.883393 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:13.883461 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:13.883721 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:25:13.883763 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:25:14.383180 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:14.383258 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:14.383605 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:14.883184 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:14.883272 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:14.883660 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:15.383351 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:15.383434 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:15.383700 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:15.883201 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:15.883305 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:15.883711 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:16.383360 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:16.383441 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:16.383809 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:25:16.383867 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:25:16.883068 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:16.883136 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:16.883406 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:17.383093 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:17.383175 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:17.383513 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:17.883239 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:17.883322 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:17.883695 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:18.383395 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:18.383464 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:18.383742 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:18.883187 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:18.883264 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:18.883645 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:25:18.883700 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:25:19.383197 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:19.383275 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:19.383625 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:19.883335 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:19.883406 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:19.883791 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:20.383200 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:20.383305 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:20.383704 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:20.883416 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:20.883493 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:20.883891 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:25:20.883947 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:25:21.383661 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:21.383731 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:21.383987 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:21.883737 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:21.883815 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:21.884385 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:22.383106 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:22.383198 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:22.383565 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:22.883149 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:22.883216 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:22.883512 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:23.383192 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:23.383276 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:23.383626 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:25:23.383680 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:25:23.883354 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:23.883454 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:23.883802 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:24.383131 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:24.383205 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:24.383521 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:24.883214 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:24.883298 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:24.883675 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:25.383244 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:25.383317 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:25.383636 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:25.883064 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:25.883143 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:25.883420 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:25:25.883474 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:25:26.383164 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:26.383254 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:26.383617 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:26.883333 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:26.883410 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:26.883740 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:27.383124 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:27.383182 1701291 node_ready.go:38] duration metric: took 6m0.000242478s for node "functional-291288" to be "Ready" ...
	I1124 09:25:27.386338 1701291 out.go:203] 
	W1124 09:25:27.389204 1701291 out.go:285] X Exiting due to GUEST_START: failed to start node: wait 6m0s for node: waiting for node to be ready: WaitNodeCondition: context deadline exceeded
	W1124 09:25:27.389224 1701291 out.go:285] * 
	W1124 09:25:27.391374 1701291 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1124 09:25:27.394404 1701291 out.go:203] 
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> containerd <==
	Nov 24 09:19:23 functional-291288 containerd[5880]: time="2025-11-24T09:19:23.924951340Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
	Nov 24 09:19:23 functional-291288 containerd[5880]: time="2025-11-24T09:19:23.924967094Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
	Nov 24 09:19:23 functional-291288 containerd[5880]: time="2025-11-24T09:19:23.925052084Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
	Nov 24 09:19:23 functional-291288 containerd[5880]: time="2025-11-24T09:19:23.925155182Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
	Nov 24 09:19:23 functional-291288 containerd[5880]: time="2025-11-24T09:19:23.925234518Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
	Nov 24 09:19:23 functional-291288 containerd[5880]: time="2025-11-24T09:19:23.925307536Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1
	Nov 24 09:19:23 functional-291288 containerd[5880]: time="2025-11-24T09:19:23.925372423Z" level=info msg="runtime interface created"
	Nov 24 09:19:23 functional-291288 containerd[5880]: time="2025-11-24T09:19:23.925429064Z" level=info msg="created NRI interface"
	Nov 24 09:19:23 functional-291288 containerd[5880]: time="2025-11-24T09:19:23.925491768Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
	Nov 24 09:19:23 functional-291288 containerd[5880]: time="2025-11-24T09:19:23.925597820Z" level=info msg="Connect containerd service"
	Nov 24 09:19:23 functional-291288 containerd[5880]: time="2025-11-24T09:19:23.926112073Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
	Nov 24 09:19:23 functional-291288 containerd[5880]: time="2025-11-24T09:19:23.927754707Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
	Nov 24 09:19:23 functional-291288 containerd[5880]: time="2025-11-24T09:19:23.937734275Z" level=info msg="Start subscribing containerd event"
	Nov 24 09:19:23 functional-291288 containerd[5880]: time="2025-11-24T09:19:23.937812454Z" level=info msg="Start recovering state"
	Nov 24 09:19:23 functional-291288 containerd[5880]: time="2025-11-24T09:19:23.938047057Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
	Nov 24 09:19:23 functional-291288 containerd[5880]: time="2025-11-24T09:19:23.938153774Z" level=info msg=serving... address=/run/containerd/containerd.sock
	Nov 24 09:19:23 functional-291288 containerd[5880]: time="2025-11-24T09:19:23.961906907Z" level=info msg="Start event monitor"
	Nov 24 09:19:23 functional-291288 containerd[5880]: time="2025-11-24T09:19:23.962097555Z" level=info msg="Start cni network conf syncer for default"
	Nov 24 09:19:23 functional-291288 containerd[5880]: time="2025-11-24T09:19:23.962159619Z" level=info msg="Start streaming server"
	Nov 24 09:19:23 functional-291288 containerd[5880]: time="2025-11-24T09:19:23.962227353Z" level=info msg="Registered namespace \"k8s.io\" with NRI"
	Nov 24 09:19:23 functional-291288 containerd[5880]: time="2025-11-24T09:19:23.962282123Z" level=info msg="runtime interface starting up..."
	Nov 24 09:19:23 functional-291288 containerd[5880]: time="2025-11-24T09:19:23.962355650Z" level=info msg="starting plugins..."
	Nov 24 09:19:23 functional-291288 containerd[5880]: time="2025-11-24T09:19:23.962422760Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
	Nov 24 09:19:23 functional-291288 systemd[1]: Started containerd.service - containerd container runtime.
	Nov 24 09:19:23 functional-291288 containerd[5880]: time="2025-11-24T09:19:23.964372736Z" level=info msg="containerd successfully booted in 0.061188s"
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:25:29.118292    9078 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:25:29.118716    9078 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:25:29.120157    9078 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:25:29.120464    9078 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:25:29.121880    9078 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[Nov24 08:20] overlayfs: idmapped layers are currently not supported
	[Nov24 08:21] overlayfs: idmapped layers are currently not supported
	[Nov24 08:22] overlayfs: idmapped layers are currently not supported
	[Nov24 08:23] overlayfs: idmapped layers are currently not supported
	[Nov24 08:24] overlayfs: idmapped layers are currently not supported
	[ +19.566509] overlayfs: idmapped layers are currently not supported
	[Nov24 08:25] overlayfs: idmapped layers are currently not supported
	[ +35.934095] overlayfs: idmapped layers are currently not supported
	[Nov24 08:27] overlayfs: idmapped layers are currently not supported
	[Nov24 08:28] overlayfs: idmapped layers are currently not supported
	[Nov24 08:29] overlayfs: idmapped layers are currently not supported
	[Nov24 08:30] overlayfs: idmapped layers are currently not supported
	[Nov24 08:31] overlayfs: idmapped layers are currently not supported
	[ +44.505180] overlayfs: idmapped layers are currently not supported
	[Nov24 08:32] overlayfs: idmapped layers are currently not supported
	[Nov24 08:33] overlayfs: idmapped layers are currently not supported
	[Nov24 08:34] overlayfs: idmapped layers are currently not supported
	[ +21.137877] overlayfs: idmapped layers are currently not supported
	[ +28.724175] overlayfs: idmapped layers are currently not supported
	[Nov24 08:36] overlayfs: idmapped layers are currently not supported
	[Nov24 08:37] overlayfs: idmapped layers are currently not supported
	[Nov24 08:38] overlayfs: idmapped layers are currently not supported
	[  +4.063834] overlayfs: idmapped layers are currently not supported
	[Nov24 08:40] overlayfs: idmapped layers are currently not supported
	[Nov24 08:42] kauditd_printk_skb: 8 callbacks suppressed
	
	
	==> kernel <==
	 09:25:29 up  8:07,  0 user,  load average: 0.34, 0.26, 0.48
	Linux functional-291288 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Nov 24 09:25:25 functional-291288 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Nov 24 09:25:26 functional-291288 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 809.
	Nov 24 09:25:26 functional-291288 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:25:26 functional-291288 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:25:26 functional-291288 kubelet[8964]: E1124 09:25:26.672275    8964 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Nov 24 09:25:26 functional-291288 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Nov 24 09:25:26 functional-291288 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Nov 24 09:25:27 functional-291288 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 810.
	Nov 24 09:25:27 functional-291288 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:25:27 functional-291288 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:25:27 functional-291288 kubelet[8969]: E1124 09:25:27.465095    8969 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Nov 24 09:25:27 functional-291288 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Nov 24 09:25:27 functional-291288 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Nov 24 09:25:28 functional-291288 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 811.
	Nov 24 09:25:28 functional-291288 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:25:28 functional-291288 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:25:28 functional-291288 kubelet[8982]: E1124 09:25:28.185864    8982 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Nov 24 09:25:28 functional-291288 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Nov 24 09:25:28 functional-291288 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Nov 24 09:25:28 functional-291288 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 812.
	Nov 24 09:25:28 functional-291288 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:25:28 functional-291288 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:25:28 functional-291288 kubelet[9028]: E1124 09:25:28.944507    9028 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Nov 24 09:25:28 functional-291288 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Nov 24 09:25:28 functional-291288 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	

-- /stdout --
helpers_test.go:262: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-291288 -n functional-291288
helpers_test.go:262: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-291288 -n functional-291288: exit status 2 (363.120113ms)

-- stdout --
	Stopped

-- /stdout --
helpers_test.go:262: status error: exit status 2 (may be ok)
helpers_test.go:264: "functional-291288" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/SoftStart (369.25s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/KubectlGetPods (2.22s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/KubectlGetPods
functional_test.go:711: (dbg) Run:  kubectl --context functional-291288 get po -A
functional_test.go:711: (dbg) Non-zero exit: kubectl --context functional-291288 get po -A: exit status 1 (62.16979ms)

** stderr ** 
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?

** /stderr **
functional_test.go:713: failed to get kubectl pods: args "kubectl --context functional-291288 get po -A" : exit status 1
functional_test.go:717: expected stderr to be empty but got *"The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?\n"*: args "kubectl --context functional-291288 get po -A"
functional_test.go:720: expected stdout to include *kube-system* but got *""*. args: "kubectl --context functional-291288 get po -A"
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:223: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/KubectlGetPods]: network settings <======
helpers_test.go:230: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:238: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/KubectlGetPods]: docker inspect <======
helpers_test.go:239: (dbg) Run:  docker inspect functional-291288
helpers_test.go:243: (dbg) docker inspect functional-291288:

-- stdout --
	[
	    {
	        "Id": "70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52",
	        "Created": "2025-11-24T09:10:51.896020191Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 1695240,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-11-24T09:10:51.968983407Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:572c983e466f1f784136812eef5cc59ac623db764bc7704d3676c4643993fd08",
	        "ResolvConfPath": "/var/lib/docker/containers/70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52/hostname",
	        "HostsPath": "/var/lib/docker/containers/70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52/hosts",
	        "LogPath": "/var/lib/docker/containers/70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52/70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52-json.log",
	        "Name": "/functional-291288",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "functional-291288:/var",
	                "/lib/modules:/lib/modules:ro"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-291288",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52",
	                "LowerDir": "/var/lib/docker/overlay2/3cc6f5f8c809d502c515552ad283ef0dee330beb830a15376b0447c77fbc81b7-init/diff:/var/lib/docker/overlay2/bf294786c7ade59cb761c7bdcc27045362c3779b00a569b685443f34ade17737/diff",
	                "MergedDir": "/var/lib/docker/overlay2/3cc6f5f8c809d502c515552ad283ef0dee330beb830a15376b0447c77fbc81b7/merged",
	                "UpperDir": "/var/lib/docker/overlay2/3cc6f5f8c809d502c515552ad283ef0dee330beb830a15376b0447c77fbc81b7/diff",
	                "WorkDir": "/var/lib/docker/overlay2/3cc6f5f8c809d502c515552ad283ef0dee330beb830a15376b0447c77fbc81b7/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "volume",
	                "Name": "functional-291288",
	                "Source": "/var/lib/docker/volumes/functional-291288/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            },
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-291288",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-291288",
	                "name.minikube.sigs.k8s.io": "functional-291288",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "09c1c2eef0dca6362dde63b4cbc372c0cfa3e4fd084b8745043d8b88925691bf",
	            "SandboxKey": "/var/run/docker/netns/09c1c2eef0dc",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34684"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34685"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34688"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34686"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34687"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-291288": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "7e:49:22:0b:f9:2c",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "1e8f91e8ad9f46b831bbb1b0589b0022d940ee9875e64a648dc80612f3ca93dc",
	                    "EndpointID": "5de5ca8ccb07584b21e6e4e30dba12e0233e8d28c3e48e705cddffe75263b337",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-291288",
	                        "70848be15fcc"
	                    ]
	                }
	            }
	        }
	    }
	]

-- /stdout --
helpers_test.go:247: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-291288 -n functional-291288
helpers_test.go:247: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-291288 -n functional-291288: exit status 2 (325.887936ms)

-- stdout --
	Running

-- /stdout --
helpers_test.go:247: status error: exit status 2 (may be ok)
helpers_test.go:252: <<< TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/KubectlGetPods FAILED: start of post-mortem logs <<<
helpers_test.go:253: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/KubectlGetPods]: minikube logs <======
helpers_test.go:255: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 logs -n 25
helpers_test.go:260: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/KubectlGetPods logs: 
-- stdout --
	
	==> Audit <==
	┌────────────────┬─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│    COMMAND     │                                                                          ARGS                                                                           │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├────────────────┼─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ ssh            │ functional-941011 ssh sudo umount -f /mount-9p                                                                                                          │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:02 UTC │                     │
	│ mount          │ -p functional-941011 /tmp/TestFunctionalparallelMountCmdVerifyCleanup1964745822/001:/mount1 --alsologtostderr -v=1                                      │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:02 UTC │                     │
	│ ssh            │ functional-941011 ssh findmnt -T /mount1                                                                                                                │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:02 UTC │ 24 Nov 25 09:02 UTC │
	│ mount          │ -p functional-941011 /tmp/TestFunctionalparallelMountCmdVerifyCleanup1964745822/001:/mount2 --alsologtostderr -v=1                                      │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:02 UTC │                     │
	│ mount          │ -p functional-941011 /tmp/TestFunctionalparallelMountCmdVerifyCleanup1964745822/001:/mount3 --alsologtostderr -v=1                                      │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:02 UTC │                     │
	│ ssh            │ functional-941011 ssh findmnt -T /mount2                                                                                                                │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:02 UTC │ 24 Nov 25 09:02 UTC │
	│ ssh            │ functional-941011 ssh findmnt -T /mount3                                                                                                                │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:02 UTC │ 24 Nov 25 09:02 UTC │
	│ mount          │ -p functional-941011 --kill=true                                                                                                                        │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:02 UTC │                     │
	│ start          │ -p functional-941011 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=containerd                                         │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:02 UTC │                     │
	│ start          │ -p functional-941011 --dry-run --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd                                                   │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:02 UTC │                     │
	│ start          │ -p functional-941011 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=containerd                                         │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:02 UTC │                     │
	│ dashboard      │ --url --port 36195 -p functional-941011 --alsologtostderr -v=1                                                                                          │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:02 UTC │                     │
	│ update-context │ functional-941011 update-context --alsologtostderr -v=2                                                                                                 │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:07 UTC │ 24 Nov 25 09:07 UTC │
	│ update-context │ functional-941011 update-context --alsologtostderr -v=2                                                                                                 │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:07 UTC │ 24 Nov 25 09:07 UTC │
	│ update-context │ functional-941011 update-context --alsologtostderr -v=2                                                                                                 │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:07 UTC │ 24 Nov 25 09:07 UTC │
	│ image          │ functional-941011 image ls --format short --alsologtostderr                                                                                             │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:07 UTC │ 24 Nov 25 09:07 UTC │
	│ image          │ functional-941011 image ls --format yaml --alsologtostderr                                                                                              │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:07 UTC │ 24 Nov 25 09:07 UTC │
	│ ssh            │ functional-941011 ssh pgrep buildkitd                                                                                                                   │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:07 UTC │                     │
	│ image          │ functional-941011 image build -t localhost/my-image:functional-941011 testdata/build --alsologtostderr                                                  │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:07 UTC │ 24 Nov 25 09:07 UTC │
	│ image          │ functional-941011 image ls                                                                                                                              │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:07 UTC │ 24 Nov 25 09:07 UTC │
	│ image          │ functional-941011 image ls --format json --alsologtostderr                                                                                              │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:07 UTC │ 24 Nov 25 09:07 UTC │
	│ image          │ functional-941011 image ls --format table --alsologtostderr                                                                                             │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:07 UTC │ 24 Nov 25 09:07 UTC │
	│ delete         │ -p functional-941011                                                                                                                                    │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:10 UTC │ 24 Nov 25 09:10 UTC │
	│ start          │ -p functional-291288 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-beta.0 │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:10 UTC │                     │
	│ start          │ -p functional-291288 --alsologtostderr -v=8                                                                                                             │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:19 UTC │                     │
	└────────────────┴─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/11/24 09:19:20
	Running on machine: ip-172-31-30-239
	Binary: Built with gc go1.25.3 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1124 09:19:20.929895 1701291 out.go:360] Setting OutFile to fd 1 ...
	I1124 09:19:20.930102 1701291 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1124 09:19:20.930128 1701291 out.go:374] Setting ErrFile to fd 2...
	I1124 09:19:20.930149 1701291 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1124 09:19:20.930488 1701291 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21978-1652607/.minikube/bin
	I1124 09:19:20.930883 1701291 out.go:368] Setting JSON to false
	I1124 09:19:20.931751 1701291 start.go:133] hostinfo: {"hostname":"ip-172-31-30-239","uptime":28890,"bootTime":1763947071,"procs":158,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"92f46a7d-c249-4c12-924a-77f64874c910"}
	I1124 09:19:20.931843 1701291 start.go:143] virtualization:  
	I1124 09:19:20.938521 1701291 out.go:179] * [functional-291288] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1124 09:19:20.941571 1701291 out.go:179]   - MINIKUBE_LOCATION=21978
	I1124 09:19:20.941660 1701291 notify.go:221] Checking for updates...
	I1124 09:19:20.947508 1701291 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1124 09:19:20.950282 1701291 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/21978-1652607/kubeconfig
	I1124 09:19:20.953189 1701291 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/21978-1652607/.minikube
	I1124 09:19:20.956068 1701291 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1124 09:19:20.958991 1701291 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1124 09:19:20.962273 1701291 config.go:182] Loaded profile config "functional-291288": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1124 09:19:20.962433 1701291 driver.go:422] Setting default libvirt URI to qemu:///system
	I1124 09:19:20.992476 1701291 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1124 09:19:20.992586 1701291 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1124 09:19:21.057666 1701291 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-11-24 09:19:21.047762616 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214831104 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1124 09:19:21.057787 1701291 docker.go:319] overlay module found
	I1124 09:19:21.060830 1701291 out.go:179] * Using the docker driver based on existing profile
	I1124 09:19:21.063549 1701291 start.go:309] selected driver: docker
	I1124 09:19:21.063567 1701291 start.go:927] validating driver "docker" against &{Name:functional-291288 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-291288 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1124 09:19:21.063661 1701291 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1124 09:19:21.063775 1701291 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1124 09:19:21.121254 1701291 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-11-24 09:19:21.111151392 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214831104 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1124 09:19:21.121789 1701291 cni.go:84] Creating CNI manager for ""
	I1124 09:19:21.121863 1701291 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1124 09:19:21.121942 1701291 start.go:353] cluster config:
	{Name:functional-291288 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-291288 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1124 09:19:21.125134 1701291 out.go:179] * Starting "functional-291288" primary control-plane node in "functional-291288" cluster
	I1124 09:19:21.127989 1701291 cache.go:134] Beginning downloading kic base image for docker with containerd
	I1124 09:19:21.131005 1701291 out.go:179] * Pulling base image v0.0.48-1763789673-21948 ...
	I1124 09:19:21.133917 1701291 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f in local docker daemon
	I1124 09:19:21.133914 1701291 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
	I1124 09:19:21.154192 1701291 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f in local docker daemon, skipping pull
	I1124 09:19:21.154216 1701291 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f exists in daemon, skipping load
	W1124 09:19:21.197477 1701291 preload.go:144] https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.35.0-beta.0/preloaded-images-k8s-v18-v1.35.0-beta.0-containerd-overlay2-arm64.tar.lz4 status code: 404
	W1124 09:19:21.391690 1701291 preload.go:144] https://github.com/kubernetes-sigs/minikube-preloads/releases/download/v18/preloaded-images-k8s-v18-v1.35.0-beta.0-containerd-overlay2-arm64.tar.lz4 status code: 404
	I1124 09:19:21.391947 1701291 profile.go:143] Saving config to /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/config.json ...
	I1124 09:19:21.392070 1701291 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:19:21.392253 1701291 cache.go:243] Successfully downloaded all kic artifacts
	I1124 09:19:21.392304 1701291 start.go:360] acquireMachinesLock for functional-291288: {Name:mk85384dc057570e1f34db593d357cea738652c4 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:19:21.392403 1701291 start.go:364] duration metric: took 38.802µs to acquireMachinesLock for "functional-291288"
	I1124 09:19:21.392443 1701291 start.go:96] Skipping create...Using existing machine configuration
	I1124 09:19:21.392463 1701291 fix.go:54] fixHost starting: 
	I1124 09:19:21.392780 1701291 cli_runner.go:164] Run: docker container inspect functional-291288 --format={{.State.Status}}
	I1124 09:19:21.413220 1701291 fix.go:112] recreateIfNeeded on functional-291288: state=Running err=<nil>
	W1124 09:19:21.413254 1701291 fix.go:138] unexpected machine state, will restart: <nil>
	I1124 09:19:21.416439 1701291 out.go:252] * Updating the running docker "functional-291288" container ...
	I1124 09:19:21.416481 1701291 machine.go:94] provisionDockerMachine start ...
	I1124 09:19:21.416565 1701291 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:19:21.444143 1701291 main.go:143] libmachine: Using SSH client type: native
	I1124 09:19:21.444471 1701291 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 34684 <nil> <nil>}
	I1124 09:19:21.444480 1701291 main.go:143] libmachine: About to run SSH command:
	hostname
	I1124 09:19:21.581815 1701291 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:19:21.598566 1701291 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-291288
	
	I1124 09:19:21.598592 1701291 ubuntu.go:182] provisioning hostname "functional-291288"
	I1124 09:19:21.598669 1701291 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:19:21.623443 1701291 main.go:143] libmachine: Using SSH client type: native
	I1124 09:19:21.623759 1701291 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 34684 <nil> <nil>}
	I1124 09:19:21.623771 1701291 main.go:143] libmachine: About to run SSH command:
	sudo hostname functional-291288 && echo "functional-291288" | sudo tee /etc/hostname
	I1124 09:19:21.758572 1701291 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:19:21.799121 1701291 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-291288
	
	I1124 09:19:21.799200 1701291 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:19:21.831127 1701291 main.go:143] libmachine: Using SSH client type: native
	I1124 09:19:21.831435 1701291 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 34684 <nil> <nil>}
	I1124 09:19:21.831451 1701291 main.go:143] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sfunctional-291288' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 functional-291288/g' /etc/hosts;
				else 
					echo '127.0.1.1 functional-291288' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1124 09:19:21.919264 1701291 cache.go:107] acquiring lock: {Name:mk22a10f0ce1f3295b61e7e76c455d0494a3e278 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:19:21.919300 1701291 cache.go:107] acquiring lock: {Name:mk1cf42e67442503a46c578224bd3cb68bf682d4 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:19:21.919365 1701291 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 exists
	I1124 09:19:21.919369 1701291 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 exists
	I1124 09:19:21.919375 1701291 cache.go:96] cache image "registry.k8s.io/kube-apiserver:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0" took 74.126µs
	I1124 09:19:21.919377 1701291 cache.go:96] cache image "gcr.io/k8s-minikube/storage-provisioner:v5" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5" took 130.274µs
	I1124 09:19:21.919383 1701291 cache.go:80] save to tar file registry.k8s.io/kube-apiserver:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 succeeded
	I1124 09:19:21.919385 1701291 cache.go:80] save to tar file gcr.io/k8s-minikube/storage-provisioner:v5 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 succeeded
	I1124 09:19:21.919395 1701291 cache.go:107] acquiring lock: {Name:mkfdc49c8e68aee34cee0c9d441ae8a4dca675c6 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:19:21.919407 1701291 cache.go:107] acquiring lock: {Name:mk85f1502dbb97830776608fb729eb3605e112e6 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:19:21.919449 1701291 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 exists
	I1124 09:19:21.919433 1701291 cache.go:107] acquiring lock: {Name:mkdbf38e05e2c47c1a7a906a2236e9e7020a94c5 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:19:21.919454 1701291 cache.go:96] cache image "registry.k8s.io/pause:3.10.1" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1" took 48.764µs
	I1124 09:19:21.919460 1701291 cache.go:80] save to tar file registry.k8s.io/pause:3.10.1 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 succeeded
	I1124 09:19:21.919471 1701291 cache.go:107] acquiring lock: {Name:mk46ce3b59d7e062b3dbc8a90fe5b4231f256471 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:19:21.919266 1701291 cache.go:107] acquiring lock: {Name:mk80fdbe7cdb5bc17c2a82b4ecfd00214559a435 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:19:21.919495 1701291 cache.go:107] acquiring lock: {Name:mk726502cb84c177b2e14fee88512325761511c5 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:19:21.919506 1701291 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 exists
	I1124 09:19:21.919511 1701291 cache.go:96] cache image "registry.k8s.io/kube-proxy:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0" took 262.074µs
	I1124 09:19:21.919517 1701291 cache.go:80] save to tar file registry.k8s.io/kube-proxy:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 succeeded
	I1124 09:19:21.919425 1701291 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 exists
	I1124 09:19:21.919525 1701291 cache.go:96] cache image "registry.k8s.io/kube-controller-manager:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0" took 132.661µs
	I1124 09:19:21.919532 1701291 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 exists
	I1124 09:19:21.919476 1701291 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 exists
	I1124 09:19:21.919540 1701291 cache.go:96] cache image "registry.k8s.io/coredns/coredns:v1.13.1" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1" took 48.796µs
	I1124 09:19:21.919547 1701291 cache.go:80] save to tar file registry.k8s.io/coredns/coredns:v1.13.1 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 succeeded
	I1124 09:19:21.919541 1701291 cache.go:96] cache image "registry.k8s.io/kube-scheduler:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0" took 109.4µs
	I1124 09:19:21.919553 1701291 cache.go:80] save to tar file registry.k8s.io/kube-scheduler:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 succeeded
	I1124 09:19:21.919533 1701291 cache.go:80] save to tar file registry.k8s.io/kube-controller-manager:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 succeeded
	I1124 09:19:21.919557 1701291 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.5.24-0 exists
	I1124 09:19:21.919563 1701291 cache.go:96] cache image "registry.k8s.io/etcd:3.5.24-0" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.5.24-0" took 93.482µs
	I1124 09:19:21.919568 1701291 cache.go:80] save to tar file registry.k8s.io/etcd:3.5.24-0 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.5.24-0 succeeded
	I1124 09:19:21.919582 1701291 cache.go:87] Successfully saved all images to host disk.
	I1124 09:19:21.982718 1701291 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	I1124 09:19:21.982799 1701291 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/21978-1652607/.minikube CaCertPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/21978-1652607/.minikube}
	I1124 09:19:21.982852 1701291 ubuntu.go:190] setting up certificates
	I1124 09:19:21.982880 1701291 provision.go:84] configureAuth start
	I1124 09:19:21.982954 1701291 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-291288
	I1124 09:19:22.001413 1701291 provision.go:143] copyHostCerts
	I1124 09:19:22.001464 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.pem
	I1124 09:19:22.001516 1701291 exec_runner.go:144] found /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.pem, removing ...
	I1124 09:19:22.001530 1701291 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.pem
	I1124 09:19:22.001614 1701291 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.pem (1078 bytes)
	I1124 09:19:22.001708 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cert.pem
	I1124 09:19:22.001726 1701291 exec_runner.go:144] found /home/jenkins/minikube-integration/21978-1652607/.minikube/cert.pem, removing ...
	I1124 09:19:22.001731 1701291 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21978-1652607/.minikube/cert.pem
	I1124 09:19:22.001757 1701291 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/21978-1652607/.minikube/cert.pem (1123 bytes)
	I1124 09:19:22.001795 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/21978-1652607/.minikube/key.pem
	I1124 09:19:22.001816 1701291 exec_runner.go:144] found /home/jenkins/minikube-integration/21978-1652607/.minikube/key.pem, removing ...
	I1124 09:19:22.001820 1701291 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21978-1652607/.minikube/key.pem
	I1124 09:19:22.001845 1701291 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/21978-1652607/.minikube/key.pem (1679 bytes)
	I1124 09:19:22.001893 1701291 provision.go:117] generating server cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca-key.pem org=jenkins.functional-291288 san=[127.0.0.1 192.168.49.2 functional-291288 localhost minikube]
	I1124 09:19:22.129571 1701291 provision.go:177] copyRemoteCerts
	I1124 09:19:22.129639 1701291 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1124 09:19:22.129681 1701291 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:19:22.147944 1701291 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34684 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-291288/id_rsa Username:docker}
	I1124 09:19:22.254207 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I1124 09:19:22.254271 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I1124 09:19:22.271706 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server.pem -> /etc/docker/server.pem
	I1124 09:19:22.271768 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server.pem --> /etc/docker/server.pem (1220 bytes)
	I1124 09:19:22.289262 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I1124 09:19:22.289325 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I1124 09:19:22.306621 1701291 provision.go:87] duration metric: took 323.706379ms to configureAuth
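The `configureAuth` phase above (`provision.go:117`) generates a CA-signed server certificate carrying the SANs listed in the log. A rough openssl equivalent, run against a throwaway CA in a temp directory — the paths and subjects here are placeholders for illustration, not minikube's real layout:

```shell
# Mint a throwaway CA, then sign a server cert with the SANs seen in the log.
workdir=$(mktemp -d)
cd "$workdir"

# Self-signed CA (stand-in for minikube's ca.pem / ca-key.pem).
openssl req -x509 -newkey rsa:2048 -nodes -days 1 \
  -keyout ca-key.pem -out ca.pem -subj "/O=minikubeCA"

# Server key + CSR (org name mirrors the log's org=jenkins.functional-291288).
openssl req -newkey rsa:2048 -nodes \
  -keyout server-key.pem -out server.csr \
  -subj "/O=jenkins.functional-291288"

# SANs copied from the log line: san=[127.0.0.1 192.168.49.2 functional-291288 localhost minikube]
printf 'subjectAltName=IP:127.0.0.1,IP:192.168.49.2,DNS:functional-291288,DNS:localhost,DNS:minikube\n' > san.cnf

# Sign the CSR with the CA, attaching the SAN extension.
openssl x509 -req -days 1 -in server.csr \
  -CA ca.pem -CAkey ca-key.pem -CAcreateserial \
  -extfile san.cnf -out server.pem

# Confirm the chain validates.
openssl verify -CAfile ca.pem server.pem
```

The resulting `server.pem` / `server-key.pem` pair corresponds to what `copyRemoteCerts` then scp's to `/etc/docker/` on the machine.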
	I1124 09:19:22.306647 1701291 ubuntu.go:206] setting minikube options for container-runtime
	I1124 09:19:22.306839 1701291 config.go:182] Loaded profile config "functional-291288": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1124 09:19:22.306847 1701291 machine.go:97] duration metric: took 890.360502ms to provisionDockerMachine
	I1124 09:19:22.306855 1701291 start.go:293] postStartSetup for "functional-291288" (driver="docker")
	I1124 09:19:22.306866 1701291 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1124 09:19:22.306912 1701291 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1124 09:19:22.306953 1701291 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:19:22.324012 1701291 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34684 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-291288/id_rsa Username:docker}
	I1124 09:19:22.434427 1701291 ssh_runner.go:195] Run: cat /etc/os-release
	I1124 09:19:22.437860 1701291 command_runner.go:130] > PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	I1124 09:19:22.437881 1701291 command_runner.go:130] > NAME="Debian GNU/Linux"
	I1124 09:19:22.437886 1701291 command_runner.go:130] > VERSION_ID="12"
	I1124 09:19:22.437890 1701291 command_runner.go:130] > VERSION="12 (bookworm)"
	I1124 09:19:22.437898 1701291 command_runner.go:130] > VERSION_CODENAME=bookworm
	I1124 09:19:22.437901 1701291 command_runner.go:130] > ID=debian
	I1124 09:19:22.437906 1701291 command_runner.go:130] > HOME_URL="https://www.debian.org/"
	I1124 09:19:22.437910 1701291 command_runner.go:130] > SUPPORT_URL="https://www.debian.org/support"
	I1124 09:19:22.437917 1701291 command_runner.go:130] > BUG_REPORT_URL="https://bugs.debian.org/"
	I1124 09:19:22.437980 1701291 main.go:143] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1124 09:19:22.437995 1701291 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1124 09:19:22.438006 1701291 filesync.go:126] Scanning /home/jenkins/minikube-integration/21978-1652607/.minikube/addons for local assets ...
	I1124 09:19:22.438064 1701291 filesync.go:126] Scanning /home/jenkins/minikube-integration/21978-1652607/.minikube/files for local assets ...
	I1124 09:19:22.438143 1701291 filesync.go:149] local asset: /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/ssl/certs/16544672.pem -> 16544672.pem in /etc/ssl/certs
	I1124 09:19:22.438150 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/ssl/certs/16544672.pem -> /etc/ssl/certs/16544672.pem
	I1124 09:19:22.438232 1701291 filesync.go:149] local asset: /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/test/nested/copy/1654467/hosts -> hosts in /etc/test/nested/copy/1654467
	I1124 09:19:22.438236 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/test/nested/copy/1654467/hosts -> /etc/test/nested/copy/1654467/hosts
	I1124 09:19:22.438277 1701291 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs /etc/test/nested/copy/1654467
	I1124 09:19:22.446265 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/ssl/certs/16544672.pem --> /etc/ssl/certs/16544672.pem (1708 bytes)
	I1124 09:19:22.463769 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/test/nested/copy/1654467/hosts --> /etc/test/nested/copy/1654467/hosts (40 bytes)
	I1124 09:19:22.481365 1701291 start.go:296] duration metric: took 174.495413ms for postStartSetup
	I1124 09:19:22.481446 1701291 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1124 09:19:22.481495 1701291 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:19:22.498552 1701291 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34684 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-291288/id_rsa Username:docker}
	I1124 09:19:22.598952 1701291 command_runner.go:130] > 14%
	I1124 09:19:22.599551 1701291 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1124 09:19:22.604050 1701291 command_runner.go:130] > 168G
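The two `df` probes above read percent-used and GiB-free for `/var` during `postStartSetup`. The same awk extraction, pointed at `/` so it runs anywhere (a sketch; the mount point differs from the log):

```shell
# Percent used (df -h column 5) and free space in GiB (df -BG column 4),
# taken from the second output row, exactly as the log's commands do.
used_pct=$(df -h / | awk 'NR==2{print $5}')
free_gb=$(df -BG / | awk 'NR==2{print $4}')
echo "used=${used_pct} free=${free_gb}"
```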
	I1124 09:19:22.604631 1701291 fix.go:56] duration metric: took 1.212164413s for fixHost
	I1124 09:19:22.604655 1701291 start.go:83] releasing machines lock for "functional-291288", held for 1.212220037s
	I1124 09:19:22.604753 1701291 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-291288
	I1124 09:19:22.621885 1701291 ssh_runner.go:195] Run: cat /version.json
	I1124 09:19:22.621944 1701291 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:19:22.622207 1701291 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1124 09:19:22.622270 1701291 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:19:22.640397 1701291 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34684 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-291288/id_rsa Username:docker}
	I1124 09:19:22.648463 1701291 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34684 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-291288/id_rsa Username:docker}
	I1124 09:19:22.746016 1701291 command_runner.go:130] > {"iso_version": "v1.37.0-1763503576-21924", "kicbase_version": "v0.0.48-1763789673-21948", "minikube_version": "v1.37.0", "commit": "2996c7ec74d570fa8ab37e6f4f8813150d0c7473"}
	I1124 09:19:22.746158 1701291 ssh_runner.go:195] Run: systemctl --version
	I1124 09:19:22.840219 1701291 command_runner.go:130] > <a href="https://github.com/kubernetes/registry.k8s.io">Temporary Redirect</a>.
	I1124 09:19:22.840264 1701291 command_runner.go:130] > systemd 252 (252.39-1~deb12u1)
	I1124 09:19:22.840285 1701291 command_runner.go:130] > +PAM +AUDIT +SELINUX +APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT +QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified
	I1124 09:19:22.840354 1701291 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I1124 09:19:22.844675 1701291 command_runner.go:130] ! stat: cannot statx '/etc/cni/net.d/*loopback.conf*': No such file or directory
	W1124 09:19:22.844725 1701291 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1124 09:19:22.844793 1701291 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1124 09:19:22.852461 1701291 cni.go:259] no active bridge cni configs found in "/etc/cni/net.d" - nothing to disable
	I1124 09:19:22.852484 1701291 start.go:496] detecting cgroup driver to use...
	I1124 09:19:22.852517 1701291 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1124 09:19:22.852584 1701291 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I1124 09:19:22.868240 1701291 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I1124 09:19:22.881367 1701291 docker.go:218] disabling cri-docker service (if available) ...
	I1124 09:19:22.881470 1701291 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1124 09:19:22.896889 1701291 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1124 09:19:22.910017 1701291 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1124 09:19:23.028071 1701291 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1124 09:19:23.171419 1701291 docker.go:234] disabling docker service ...
	I1124 09:19:23.171539 1701291 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1124 09:19:23.187505 1701291 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1124 09:19:23.201405 1701291 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1124 09:19:23.324426 1701291 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1124 09:19:23.445186 1701291 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1124 09:19:23.457903 1701291 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I1124 09:19:23.470553 1701291 command_runner.go:130] > runtime-endpoint: unix:///run/containerd/containerd.sock
	I1124 09:19:23.472034 1701291 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:19:23.623898 1701291 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10.1"|' /etc/containerd/config.toml"
	I1124 09:19:23.632988 1701291 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I1124 09:19:23.641976 1701291 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I1124 09:19:23.642063 1701291 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I1124 09:19:23.651244 1701291 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1124 09:19:23.660198 1701291 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I1124 09:19:23.668706 1701291 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1124 09:19:23.677261 1701291 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1124 09:19:23.685600 1701291 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I1124 09:19:23.694593 1701291 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I1124 09:19:23.703191 1701291 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
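The run of `sed -i` commands above rewrites `/etc/containerd/config.toml` in place (pause image, cgroup driver, runtime type, CNI conf dir). A minimal sketch of two of those substitutions applied to a throwaway copy — the sample config below is invented for illustration, trimmed to the keys the edits touch:

```shell
# Toy config.toml resembling the fragment the log edits (invented sample).
cat > /tmp/config.toml <<'EOF'
[plugins."io.containerd.grpc.v1.cri"]
  sandbox_image = "registry.k8s.io/pause:3.9"
[plugins."io.containerd.grpc.v1.cri".containerd.runtimes.runc.options]
  SystemdCgroup = true
EOF

# Same substitutions the log runs (minus sudo), preserving indentation via \1:
sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10.1"|' /tmp/config.toml
sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /tmp/config.toml

cat /tmp/config.toml
```

`SystemdCgroup = false` is what "configuring containerd to use \"cgroupfs\" as cgroup driver" amounts to: with the systemd cgroup integration off, containerd manages cgroups directly.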
	I1124 09:19:23.712006 1701291 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1124 09:19:23.718640 1701291 command_runner.go:130] > net.bridge.bridge-nf-call-iptables = 1
	I1124 09:19:23.719691 1701291 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1124 09:19:23.727172 1701291 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1124 09:19:23.844539 1701291 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I1124 09:19:23.964625 1701291 start.go:543] Will wait 60s for socket path /run/containerd/containerd.sock
	I1124 09:19:23.964708 1701291 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I1124 09:19:23.969624 1701291 command_runner.go:130] >   File: /run/containerd/containerd.sock
	I1124 09:19:23.969648 1701291 command_runner.go:130] >   Size: 0         	Blocks: 0          IO Block: 4096   socket
	I1124 09:19:23.969655 1701291 command_runner.go:130] > Device: 0,72	Inode: 1619        Links: 1
	I1124 09:19:23.969671 1701291 command_runner.go:130] > Access: (0660/srw-rw----)  Uid: (    0/    root)   Gid: (    0/    root)
	I1124 09:19:23.969685 1701291 command_runner.go:130] > Access: 2025-11-24 09:19:23.931843190 +0000
	I1124 09:19:23.969693 1701291 command_runner.go:130] > Modify: 2025-11-24 09:19:23.931843190 +0000
	I1124 09:19:23.969699 1701291 command_runner.go:130] > Change: 2025-11-24 09:19:23.931843190 +0000
	I1124 09:19:23.969707 1701291 command_runner.go:130] >  Birth: -
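"Will wait 60s for socket path" followed by the `stat` above is a poll-until-present loop. A sketch of the same pattern in shell — `wait_for_path` and `/tmp/demo.sock` are invented names, standing in for the check against `/run/containerd/containerd.sock`:

```shell
# Poll for a path to appear, failing after a timeout (seconds).
wait_for_path() {
  local path=$1 timeout=${2:-60} waited=0
  until stat "$path" >/dev/null 2>&1; do
    [ "$waited" -ge "$timeout" ] && return 1
    sleep 1
    waited=$((waited + 1))
  done
}

touch /tmp/demo.sock
wait_for_path /tmp/demo.sock 5 && echo "socket ready"
```

In the log the socket appears immediately after `systemctl restart containerd`, so the real wait returns on the first `stat`.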
	I1124 09:19:23.970283 1701291 start.go:564] Will wait 60s for crictl version
	I1124 09:19:23.970345 1701291 ssh_runner.go:195] Run: which crictl
	I1124 09:19:23.973724 1701291 command_runner.go:130] > /usr/local/bin/crictl
	I1124 09:19:23.974288 1701291 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1124 09:19:23.995301 1701291 command_runner.go:130] > Version:  0.1.0
	I1124 09:19:23.995587 1701291 command_runner.go:130] > RuntimeName:  containerd
	I1124 09:19:23.995841 1701291 command_runner.go:130] > RuntimeVersion:  v2.1.5
	I1124 09:19:23.996049 1701291 command_runner.go:130] > RuntimeApiVersion:  v1
	I1124 09:19:23.998158 1701291 start.go:580] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v2.1.5
	RuntimeApiVersion:  v1
	I1124 09:19:23.998238 1701291 ssh_runner.go:195] Run: containerd --version
	I1124 09:19:24.020107 1701291 command_runner.go:130] > containerd containerd.io v2.1.5 fcd43222d6b07379a4be9786bda52438f0dd16a1
	I1124 09:19:24.020449 1701291 ssh_runner.go:195] Run: containerd --version
	I1124 09:19:24.041776 1701291 command_runner.go:130] > containerd containerd.io v2.1.5 fcd43222d6b07379a4be9786bda52438f0dd16a1
	I1124 09:19:24.047417 1701291 out.go:179] * Preparing Kubernetes v1.35.0-beta.0 on containerd 2.1.5 ...
	I1124 09:19:24.050497 1701291 cli_runner.go:164] Run: docker network inspect functional-291288 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1124 09:19:24.067531 1701291 ssh_runner.go:195] Run: grep 192.168.49.1	host.minikube.internal$ /etc/hosts
	I1124 09:19:24.071507 1701291 command_runner.go:130] > 192.168.49.1	host.minikube.internal
	I1124 09:19:24.071622 1701291 kubeadm.go:884] updating cluster {Name:functional-291288 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-291288 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1124 09:19:24.071797 1701291 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:19:24.253230 1701291 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:19:24.402285 1701291 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:19:24.552419 1701291 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
	I1124 09:19:24.552515 1701291 ssh_runner.go:195] Run: sudo crictl images --output json
	I1124 09:19:24.577200 1701291 command_runner.go:130] > {
	I1124 09:19:24.577221 1701291 command_runner.go:130] >   "images":  [
	I1124 09:19:24.577226 1701291 command_runner.go:130] >     {
	I1124 09:19:24.577235 1701291 command_runner.go:130] >       "id":  "sha256:66749159455b3f08c8318fe0233122f54d0f5889f9c5fdfb73c3fd9d99895b51",
	I1124 09:19:24.577240 1701291 command_runner.go:130] >       "repoTags":  [
	I1124 09:19:24.577245 1701291 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner:v5"
	I1124 09:19:24.577248 1701291 command_runner.go:130] >       ],
	I1124 09:19:24.577252 1701291 command_runner.go:130] >       "repoDigests":  [],
	I1124 09:19:24.577256 1701291 command_runner.go:130] >       "size":  "8032639",
	I1124 09:19:24.577264 1701291 command_runner.go:130] >       "username":  "",
	I1124 09:19:24.577269 1701291 command_runner.go:130] >       "pinned":  false
	I1124 09:19:24.577272 1701291 command_runner.go:130] >     },
	I1124 09:19:24.577276 1701291 command_runner.go:130] >     {
	I1124 09:19:24.577283 1701291 command_runner.go:130] >       "id":  "sha256:e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf",
	I1124 09:19:24.577290 1701291 command_runner.go:130] >       "repoTags":  [
	I1124 09:19:24.577296 1701291 command_runner.go:130] >         "registry.k8s.io/coredns/coredns:v1.13.1"
	I1124 09:19:24.577299 1701291 command_runner.go:130] >       ],
	I1124 09:19:24.577308 1701291 command_runner.go:130] >       "repoDigests":  [],
	I1124 09:19:24.577330 1701291 command_runner.go:130] >       "size":  "21166088",
	I1124 09:19:24.577335 1701291 command_runner.go:130] >       "username":  "nonroot",
	I1124 09:19:24.577339 1701291 command_runner.go:130] >       "pinned":  false
	I1124 09:19:24.577349 1701291 command_runner.go:130] >     },
	I1124 09:19:24.577357 1701291 command_runner.go:130] >     {
	I1124 09:19:24.577364 1701291 command_runner.go:130] >       "id":  "sha256:1211402d28f5813ed906916bfcdd0a7404c2f9048ef5bb54387a6745bc410eca",
	I1124 09:19:24.577368 1701291 command_runner.go:130] >       "repoTags":  [
	I1124 09:19:24.577373 1701291 command_runner.go:130] >         "registry.k8s.io/etcd:3.5.24-0"
	I1124 09:19:24.577376 1701291 command_runner.go:130] >       ],
	I1124 09:19:24.577380 1701291 command_runner.go:130] >       "repoDigests":  [],
	I1124 09:19:24.577384 1701291 command_runner.go:130] >       "size":  "21880804",
	I1124 09:19:24.577391 1701291 command_runner.go:130] >       "uid":  {
	I1124 09:19:24.577395 1701291 command_runner.go:130] >         "value":  "0"
	I1124 09:19:24.577400 1701291 command_runner.go:130] >       },
	I1124 09:19:24.577404 1701291 command_runner.go:130] >       "username":  "",
	I1124 09:19:24.577408 1701291 command_runner.go:130] >       "pinned":  false
	I1124 09:19:24.577421 1701291 command_runner.go:130] >     },
	I1124 09:19:24.577424 1701291 command_runner.go:130] >     {
	I1124 09:19:24.577431 1701291 command_runner.go:130] >       "id":  "sha256:2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42",
	I1124 09:19:24.577434 1701291 command_runner.go:130] >       "repoTags":  [
	I1124 09:19:24.577443 1701291 command_runner.go:130] >         "registry.k8s.io/etcd:3.6.5-0"
	I1124 09:19:24.577450 1701291 command_runner.go:130] >       ],
	I1124 09:19:24.577454 1701291 command_runner.go:130] >       "repoDigests":  [
	I1124 09:19:24.577461 1701291 command_runner.go:130] >         "registry.k8s.io/etcd@sha256:042ef9c02799eb9303abf1aa99b09f09d94b8ee3ba0c2dd3f42dc4e1d3dce534"
	I1124 09:19:24.577465 1701291 command_runner.go:130] >       ],
	I1124 09:19:24.577469 1701291 command_runner.go:130] >       "size":  "21136588",
	I1124 09:19:24.577472 1701291 command_runner.go:130] >       "uid":  {
	I1124 09:19:24.577479 1701291 command_runner.go:130] >         "value":  "0"
	I1124 09:19:24.577482 1701291 command_runner.go:130] >       },
	I1124 09:19:24.577486 1701291 command_runner.go:130] >       "username":  "",
	I1124 09:19:24.577492 1701291 command_runner.go:130] >       "pinned":  false
	I1124 09:19:24.577495 1701291 command_runner.go:130] >     },
	I1124 09:19:24.577502 1701291 command_runner.go:130] >     {
	I1124 09:19:24.577512 1701291 command_runner.go:130] >       "id":  "sha256:ccd634d9bcc36ac6235e9c86676cd3a02c06afc3788a25f1bbf39ca7d44585f4",
	I1124 09:19:24.577516 1701291 command_runner.go:130] >       "repoTags":  [
	I1124 09:19:24.577521 1701291 command_runner.go:130] >         "registry.k8s.io/kube-apiserver:v1.35.0-beta.0"
	I1124 09:19:24.577527 1701291 command_runner.go:130] >       ],
	I1124 09:19:24.577531 1701291 command_runner.go:130] >       "repoDigests":  [],
	I1124 09:19:24.577535 1701291 command_runner.go:130] >       "size":  "24676285",
	I1124 09:19:24.577538 1701291 command_runner.go:130] >       "uid":  {
	I1124 09:19:24.577541 1701291 command_runner.go:130] >         "value":  "0"
	I1124 09:19:24.577545 1701291 command_runner.go:130] >       },
	I1124 09:19:24.577550 1701291 command_runner.go:130] >       "username":  "",
	I1124 09:19:24.577556 1701291 command_runner.go:130] >       "pinned":  false
	I1124 09:19:24.577560 1701291 command_runner.go:130] >     },
	I1124 09:19:24.577563 1701291 command_runner.go:130] >     {
	I1124 09:19:24.577569 1701291 command_runner.go:130] >       "id":  "sha256:68b5f775f18769fcb77bd8474c80bda2050163b6c66f4551f352b7381b8ca5be",
	I1124 09:19:24.577581 1701291 command_runner.go:130] >       "repoTags":  [
	I1124 09:19:24.577586 1701291 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager:v1.35.0-beta.0"
	I1124 09:19:24.577590 1701291 command_runner.go:130] >       ],
	I1124 09:19:24.577594 1701291 command_runner.go:130] >       "repoDigests":  [],
	I1124 09:19:24.577605 1701291 command_runner.go:130] >       "size":  "20658969",
	I1124 09:19:24.577608 1701291 command_runner.go:130] >       "uid":  {
	I1124 09:19:24.577612 1701291 command_runner.go:130] >         "value":  "0"
	I1124 09:19:24.577615 1701291 command_runner.go:130] >       },
	I1124 09:19:24.577619 1701291 command_runner.go:130] >       "username":  "",
	I1124 09:19:24.577624 1701291 command_runner.go:130] >       "pinned":  false
	I1124 09:19:24.577629 1701291 command_runner.go:130] >     },
	I1124 09:19:24.577633 1701291 command_runner.go:130] >     {
	I1124 09:19:24.577644 1701291 command_runner.go:130] >       "id":  "sha256:404c2e12861777b763b8feaa316d36680fc68ad308a8d2f6e55f1bb981cdd904",
	I1124 09:19:24.577655 1701291 command_runner.go:130] >       "repoTags":  [
	I1124 09:19:24.577660 1701291 command_runner.go:130] >         "registry.k8s.io/kube-proxy:v1.35.0-beta.0"
	I1124 09:19:24.577663 1701291 command_runner.go:130] >       ],
	I1124 09:19:24.577667 1701291 command_runner.go:130] >       "repoDigests":  [],
	I1124 09:19:24.577678 1701291 command_runner.go:130] >       "size":  "22428165",
	I1124 09:19:24.577686 1701291 command_runner.go:130] >       "username":  "",
	I1124 09:19:24.577692 1701291 command_runner.go:130] >       "pinned":  false
	I1124 09:19:24.577696 1701291 command_runner.go:130] >     },
	I1124 09:19:24.577706 1701291 command_runner.go:130] >     {
	I1124 09:19:24.577712 1701291 command_runner.go:130] >       "id":  "sha256:16378741539f1be9c6e347d127537d379a6592587b09b4eb47964cb5c43a409b",
	I1124 09:19:24.577716 1701291 command_runner.go:130] >       "repoTags":  [
	I1124 09:19:24.577721 1701291 command_runner.go:130] >         "registry.k8s.io/kube-scheduler:v1.35.0-beta.0"
	I1124 09:19:24.577724 1701291 command_runner.go:130] >       ],
	I1124 09:19:24.577728 1701291 command_runner.go:130] >       "repoDigests":  [],
	I1124 09:19:24.577738 1701291 command_runner.go:130] >       "size":  "15389290",
	I1124 09:19:24.577744 1701291 command_runner.go:130] >       "uid":  {
	I1124 09:19:24.577751 1701291 command_runner.go:130] >         "value":  "0"
	I1124 09:19:24.577754 1701291 command_runner.go:130] >       },
	I1124 09:19:24.577758 1701291 command_runner.go:130] >       "username":  "",
	I1124 09:19:24.577762 1701291 command_runner.go:130] >       "pinned":  false
	I1124 09:19:24.577768 1701291 command_runner.go:130] >     },
	I1124 09:19:24.577771 1701291 command_runner.go:130] >     {
	I1124 09:19:24.577779 1701291 command_runner.go:130] >       "id":  "sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd",
	I1124 09:19:24.577786 1701291 command_runner.go:130] >       "repoTags":  [
	I1124 09:19:24.577791 1701291 command_runner.go:130] >         "registry.k8s.io/pause:3.10.1"
	I1124 09:19:24.577794 1701291 command_runner.go:130] >       ],
	I1124 09:19:24.577797 1701291 command_runner.go:130] >       "repoDigests":  [],
	I1124 09:19:24.577801 1701291 command_runner.go:130] >       "size":  "265458",
	I1124 09:19:24.577805 1701291 command_runner.go:130] >       "uid":  {
	I1124 09:19:24.577809 1701291 command_runner.go:130] >         "value":  "65535"
	I1124 09:19:24.577815 1701291 command_runner.go:130] >       },
	I1124 09:19:24.577819 1701291 command_runner.go:130] >       "username":  "",
	I1124 09:19:24.577824 1701291 command_runner.go:130] >       "pinned":  true
	I1124 09:19:24.577827 1701291 command_runner.go:130] >     }
	I1124 09:19:24.577831 1701291 command_runner.go:130] >   ]
	I1124 09:19:24.577842 1701291 command_runner.go:130] > }
	I1124 09:19:24.577988 1701291 containerd.go:627] all images are preloaded for containerd runtime.
	I1124 09:19:24.578000 1701291 cache_images.go:86] Images are preloaded, skipping loading
	I1124 09:19:24.578012 1701291 kubeadm.go:935] updating node { 192.168.49.2 8441 v1.35.0-beta.0 containerd true true} ...
	I1124 09:19:24.578111 1701291 kubeadm.go:947] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.35.0-beta.0/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=functional-291288 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.49.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-291288 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I1124 09:19:24.578176 1701291 ssh_runner.go:195] Run: sudo crictl info
	I1124 09:19:24.601872 1701291 command_runner.go:130] > {
	I1124 09:19:24.601895 1701291 command_runner.go:130] >   "cniconfig": {
	I1124 09:19:24.601901 1701291 command_runner.go:130] >     "Networks": [
	I1124 09:19:24.601905 1701291 command_runner.go:130] >       {
	I1124 09:19:24.601909 1701291 command_runner.go:130] >         "Config": {
	I1124 09:19:24.601914 1701291 command_runner.go:130] >           "CNIVersion": "0.3.1",
	I1124 09:19:24.601919 1701291 command_runner.go:130] >           "Name": "cni-loopback",
	I1124 09:19:24.601924 1701291 command_runner.go:130] >           "Plugins": [
	I1124 09:19:24.601927 1701291 command_runner.go:130] >             {
	I1124 09:19:24.601931 1701291 command_runner.go:130] >               "Network": {
	I1124 09:19:24.601935 1701291 command_runner.go:130] >                 "ipam": {},
	I1124 09:19:24.601941 1701291 command_runner.go:130] >                 "type": "loopback"
	I1124 09:19:24.601945 1701291 command_runner.go:130] >               },
	I1124 09:19:24.601958 1701291 command_runner.go:130] >               "Source": "{\"type\":\"loopback\"}"
	I1124 09:19:24.601965 1701291 command_runner.go:130] >             }
	I1124 09:19:24.601969 1701291 command_runner.go:130] >           ],
	I1124 09:19:24.601979 1701291 command_runner.go:130] >           "Source": "{\n\"cniVersion\": \"0.3.1\",\n\"name\": \"cni-loopback\",\n\"plugins\": [{\n  \"type\": \"loopback\"\n}]\n}"
	I1124 09:19:24.601983 1701291 command_runner.go:130] >         },
	I1124 09:19:24.601991 1701291 command_runner.go:130] >         "IFName": "lo"
	I1124 09:19:24.601994 1701291 command_runner.go:130] >       }
	I1124 09:19:24.601997 1701291 command_runner.go:130] >     ],
	I1124 09:19:24.602003 1701291 command_runner.go:130] >     "PluginConfDir": "/etc/cni/net.d",
	I1124 09:19:24.602007 1701291 command_runner.go:130] >     "PluginDirs": [
	I1124 09:19:24.602014 1701291 command_runner.go:130] >       "/opt/cni/bin"
	I1124 09:19:24.602018 1701291 command_runner.go:130] >     ],
	I1124 09:19:24.602026 1701291 command_runner.go:130] >     "PluginMaxConfNum": 1,
	I1124 09:19:24.602030 1701291 command_runner.go:130] >     "Prefix": "eth"
	I1124 09:19:24.602033 1701291 command_runner.go:130] >   },
	I1124 09:19:24.602037 1701291 command_runner.go:130] >   "config": {
	I1124 09:19:24.602041 1701291 command_runner.go:130] >     "cdiSpecDirs": [
	I1124 09:19:24.602048 1701291 command_runner.go:130] >       "/etc/cdi",
	I1124 09:19:24.602051 1701291 command_runner.go:130] >       "/var/run/cdi"
	I1124 09:19:24.602055 1701291 command_runner.go:130] >     ],
	I1124 09:19:24.602069 1701291 command_runner.go:130] >     "cni": {
	I1124 09:19:24.602073 1701291 command_runner.go:130] >       "binDir": "",
	I1124 09:19:24.602076 1701291 command_runner.go:130] >       "binDirs": [
	I1124 09:19:24.602080 1701291 command_runner.go:130] >         "/opt/cni/bin"
	I1124 09:19:24.602083 1701291 command_runner.go:130] >       ],
	I1124 09:19:24.602087 1701291 command_runner.go:130] >       "confDir": "/etc/cni/net.d",
	I1124 09:19:24.602092 1701291 command_runner.go:130] >       "confTemplate": "",
	I1124 09:19:24.602098 1701291 command_runner.go:130] >       "ipPref": "",
	I1124 09:19:24.602103 1701291 command_runner.go:130] >       "maxConfNum": 1,
	I1124 09:19:24.602109 1701291 command_runner.go:130] >       "setupSerially": false,
	I1124 09:19:24.602114 1701291 command_runner.go:130] >       "useInternalLoopback": false
	I1124 09:19:24.602120 1701291 command_runner.go:130] >     },
	I1124 09:19:24.602126 1701291 command_runner.go:130] >     "containerd": {
	I1124 09:19:24.602132 1701291 command_runner.go:130] >       "defaultRuntimeName": "runc",
	I1124 09:19:24.602137 1701291 command_runner.go:130] >       "ignoreBlockIONotEnabledErrors": false,
	I1124 09:19:24.602145 1701291 command_runner.go:130] >       "ignoreRdtNotEnabledErrors": false,
	I1124 09:19:24.602149 1701291 command_runner.go:130] >       "runtimes": {
	I1124 09:19:24.602152 1701291 command_runner.go:130] >         "runc": {
	I1124 09:19:24.602157 1701291 command_runner.go:130] >           "ContainerAnnotations": null,
	I1124 09:19:24.602163 1701291 command_runner.go:130] >           "PodAnnotations": null,
	I1124 09:19:24.602169 1701291 command_runner.go:130] >           "baseRuntimeSpec": "",
	I1124 09:19:24.602174 1701291 command_runner.go:130] >           "cgroupWritable": false,
	I1124 09:19:24.602179 1701291 command_runner.go:130] >           "cniConfDir": "",
	I1124 09:19:24.602185 1701291 command_runner.go:130] >           "cniMaxConfNum": 0,
	I1124 09:19:24.602190 1701291 command_runner.go:130] >           "io_type": "",
	I1124 09:19:24.602195 1701291 command_runner.go:130] >           "options": {
	I1124 09:19:24.602200 1701291 command_runner.go:130] >             "BinaryName": "",
	I1124 09:19:24.602212 1701291 command_runner.go:130] >             "CriuImagePath": "",
	I1124 09:19:24.602217 1701291 command_runner.go:130] >             "CriuWorkPath": "",
	I1124 09:19:24.602221 1701291 command_runner.go:130] >             "IoGid": 0,
	I1124 09:19:24.602226 1701291 command_runner.go:130] >             "IoUid": 0,
	I1124 09:19:24.602232 1701291 command_runner.go:130] >             "NoNewKeyring": false,
	I1124 09:19:24.602237 1701291 command_runner.go:130] >             "Root": "",
	I1124 09:19:24.602243 1701291 command_runner.go:130] >             "ShimCgroup": "",
	I1124 09:19:24.602248 1701291 command_runner.go:130] >             "SystemdCgroup": false
	I1124 09:19:24.602252 1701291 command_runner.go:130] >           },
	I1124 09:19:24.602257 1701291 command_runner.go:130] >           "privileged_without_host_devices": false,
	I1124 09:19:24.602266 1701291 command_runner.go:130] >           "privileged_without_host_devices_all_devices_allowed": false,
	I1124 09:19:24.602272 1701291 command_runner.go:130] >           "runtimePath": "",
	I1124 09:19:24.602278 1701291 command_runner.go:130] >           "runtimeType": "io.containerd.runc.v2",
	I1124 09:19:24.602285 1701291 command_runner.go:130] >           "sandboxer": "podsandbox",
	I1124 09:19:24.602290 1701291 command_runner.go:130] >           "snapshotter": ""
	I1124 09:19:24.602293 1701291 command_runner.go:130] >         }
	I1124 09:19:24.602296 1701291 command_runner.go:130] >       }
	I1124 09:19:24.602299 1701291 command_runner.go:130] >     },
	I1124 09:19:24.602309 1701291 command_runner.go:130] >     "containerdEndpoint": "/run/containerd/containerd.sock",
	I1124 09:19:24.602332 1701291 command_runner.go:130] >     "containerdRootDir": "/var/lib/containerd",
	I1124 09:19:24.602339 1701291 command_runner.go:130] >     "device_ownership_from_security_context": false,
	I1124 09:19:24.602344 1701291 command_runner.go:130] >     "disableApparmor": false,
	I1124 09:19:24.602351 1701291 command_runner.go:130] >     "disableHugetlbController": true,
	I1124 09:19:24.602355 1701291 command_runner.go:130] >     "disableProcMount": false,
	I1124 09:19:24.602362 1701291 command_runner.go:130] >     "drainExecSyncIOTimeout": "0s",
	I1124 09:19:24.602366 1701291 command_runner.go:130] >     "enableCDI": true,
	I1124 09:19:24.602378 1701291 command_runner.go:130] >     "enableSelinux": false,
	I1124 09:19:24.602382 1701291 command_runner.go:130] >     "enableUnprivilegedICMP": true,
	I1124 09:19:24.602387 1701291 command_runner.go:130] >     "enableUnprivilegedPorts": true,
	I1124 09:19:24.602392 1701291 command_runner.go:130] >     "ignoreDeprecationWarnings": null,
	I1124 09:19:24.602403 1701291 command_runner.go:130] >     "ignoreImageDefinedVolumes": false,
	I1124 09:19:24.602408 1701291 command_runner.go:130] >     "maxContainerLogLineSize": 16384,
	I1124 09:19:24.602413 1701291 command_runner.go:130] >     "netnsMountsUnderStateDir": false,
	I1124 09:19:24.602417 1701291 command_runner.go:130] >     "restrictOOMScoreAdj": false,
	I1124 09:19:24.602422 1701291 command_runner.go:130] >     "rootDir": "/var/lib/containerd/io.containerd.grpc.v1.cri",
	I1124 09:19:24.602427 1701291 command_runner.go:130] >     "selinuxCategoryRange": 1024,
	I1124 09:19:24.602432 1701291 command_runner.go:130] >     "stateDir": "/run/containerd/io.containerd.grpc.v1.cri",
	I1124 09:19:24.602438 1701291 command_runner.go:130] >     "tolerateMissingHugetlbController": true,
	I1124 09:19:24.602441 1701291 command_runner.go:130] >     "unsetSeccompProfile": ""
	I1124 09:19:24.602445 1701291 command_runner.go:130] >   },
	I1124 09:19:24.602449 1701291 command_runner.go:130] >   "features": {
	I1124 09:19:24.602492 1701291 command_runner.go:130] >     "supplemental_groups_policy": true
	I1124 09:19:24.602500 1701291 command_runner.go:130] >   },
	I1124 09:19:24.602504 1701291 command_runner.go:130] >   "golang": "go1.24.9",
	I1124 09:19:24.602513 1701291 command_runner.go:130] >   "lastCNILoadStatus": "cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config",
	I1124 09:19:24.602527 1701291 command_runner.go:130] >   "lastCNILoadStatus.default": "cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config",
	I1124 09:19:24.602532 1701291 command_runner.go:130] >   "runtimeHandlers": [
	I1124 09:19:24.602537 1701291 command_runner.go:130] >     {
	I1124 09:19:24.602541 1701291 command_runner.go:130] >       "features": {
	I1124 09:19:24.602546 1701291 command_runner.go:130] >         "recursive_read_only_mounts": true,
	I1124 09:19:24.602550 1701291 command_runner.go:130] >         "user_namespaces": true
	I1124 09:19:24.602555 1701291 command_runner.go:130] >       }
	I1124 09:19:24.602564 1701291 command_runner.go:130] >     },
	I1124 09:19:24.602570 1701291 command_runner.go:130] >     {
	I1124 09:19:24.602575 1701291 command_runner.go:130] >       "features": {
	I1124 09:19:24.602587 1701291 command_runner.go:130] >         "recursive_read_only_mounts": true,
	I1124 09:19:24.602592 1701291 command_runner.go:130] >         "user_namespaces": true
	I1124 09:19:24.602595 1701291 command_runner.go:130] >       },
	I1124 09:19:24.602598 1701291 command_runner.go:130] >       "name": "runc"
	I1124 09:19:24.602609 1701291 command_runner.go:130] >     }
	I1124 09:19:24.602612 1701291 command_runner.go:130] >   ],
	I1124 09:19:24.602615 1701291 command_runner.go:130] >   "status": {
	I1124 09:19:24.602619 1701291 command_runner.go:130] >     "conditions": [
	I1124 09:19:24.602623 1701291 command_runner.go:130] >       {
	I1124 09:19:24.602629 1701291 command_runner.go:130] >         "message": "",
	I1124 09:19:24.602633 1701291 command_runner.go:130] >         "reason": "",
	I1124 09:19:24.602637 1701291 command_runner.go:130] >         "status": true,
	I1124 09:19:24.602641 1701291 command_runner.go:130] >         "type": "RuntimeReady"
	I1124 09:19:24.602645 1701291 command_runner.go:130] >       },
	I1124 09:19:24.602648 1701291 command_runner.go:130] >       {
	I1124 09:19:24.602655 1701291 command_runner.go:130] >         "message": "Network plugin returns error: cni plugin not initialized",
	I1124 09:19:24.602662 1701291 command_runner.go:130] >         "reason": "NetworkPluginNotReady",
	I1124 09:19:24.602666 1701291 command_runner.go:130] >         "status": false,
	I1124 09:19:24.602678 1701291 command_runner.go:130] >         "type": "NetworkReady"
	I1124 09:19:24.602682 1701291 command_runner.go:130] >       },
	I1124 09:19:24.602685 1701291 command_runner.go:130] >       {
	I1124 09:19:24.602688 1701291 command_runner.go:130] >         "message": "",
	I1124 09:19:24.602692 1701291 command_runner.go:130] >         "reason": "",
	I1124 09:19:24.602703 1701291 command_runner.go:130] >         "status": true,
	I1124 09:19:24.602709 1701291 command_runner.go:130] >         "type": "ContainerdHasNoDeprecationWarnings"
	I1124 09:19:24.602712 1701291 command_runner.go:130] >       }
	I1124 09:19:24.602715 1701291 command_runner.go:130] >     ]
	I1124 09:19:24.602718 1701291 command_runner.go:130] >   }
	I1124 09:19:24.602721 1701291 command_runner.go:130] > }
	I1124 09:19:24.603033 1701291 cni.go:84] Creating CNI manager for ""
	I1124 09:19:24.603051 1701291 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1124 09:19:24.603074 1701291 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1124 09:19:24.603102 1701291 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.49.2 APIServerPort:8441 KubernetesVersion:v1.35.0-beta.0 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:functional-291288 NodeName:functional-291288 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.49.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.49.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1124 09:19:24.603228 1701291 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.49.2
	  bindPort: 8441
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "functional-291288"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.49.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8441
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.35.0-beta.0
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I1124 09:19:24.603309 1701291 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.35.0-beta.0
	I1124 09:19:24.611119 1701291 command_runner.go:130] > kubeadm
	I1124 09:19:24.611140 1701291 command_runner.go:130] > kubectl
	I1124 09:19:24.611146 1701291 command_runner.go:130] > kubelet
	I1124 09:19:24.611161 1701291 binaries.go:51] Found k8s binaries, skipping transfer
	I1124 09:19:24.611223 1701291 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1124 09:19:24.618883 1701291 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (328 bytes)
	I1124 09:19:24.633448 1701291 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (359 bytes)
	I1124 09:19:24.650072 1701291 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2237 bytes)
	I1124 09:19:24.664688 1701291 ssh_runner.go:195] Run: grep 192.168.49.2	control-plane.minikube.internal$ /etc/hosts
	I1124 09:19:24.668362 1701291 command_runner.go:130] > 192.168.49.2	control-plane.minikube.internal
	I1124 09:19:24.668996 1701291 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1124 09:19:24.787731 1701291 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1124 09:19:25.630718 1701291 certs.go:69] Setting up /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288 for IP: 192.168.49.2
	I1124 09:19:25.630736 1701291 certs.go:195] generating shared ca certs ...
	I1124 09:19:25.630751 1701291 certs.go:227] acquiring lock for ca certs: {Name:mkbe540a30c4376a351176f7fe6fec044d058b09 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1124 09:19:25.630878 1701291 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.key
	I1124 09:19:25.630932 1701291 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/proxy-client-ca.key
	I1124 09:19:25.630939 1701291 certs.go:257] generating profile certs ...
	I1124 09:19:25.631060 1701291 certs.go:360] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/client.key
	I1124 09:19:25.631119 1701291 certs.go:360] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/apiserver.key.5acb2515
	I1124 09:19:25.631156 1701291 certs.go:360] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/proxy-client.key
	I1124 09:19:25.631166 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I1124 09:19:25.631180 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I1124 09:19:25.631190 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I1124 09:19:25.631200 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I1124 09:19:25.631210 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I1124 09:19:25.631221 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I1124 09:19:25.631231 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I1124 09:19:25.631241 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I1124 09:19:25.631304 1701291 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/1654467.pem (1338 bytes)
	W1124 09:19:25.631338 1701291 certs.go:480] ignoring /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/1654467_empty.pem, impossibly tiny 0 bytes
	I1124 09:19:25.631352 1701291 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca-key.pem (1671 bytes)
	I1124 09:19:25.631382 1701291 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem (1078 bytes)
	I1124 09:19:25.631410 1701291 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/cert.pem (1123 bytes)
	I1124 09:19:25.631434 1701291 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/key.pem (1679 bytes)
	I1124 09:19:25.631484 1701291 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/ssl/certs/16544672.pem (1708 bytes)
	I1124 09:19:25.631512 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I1124 09:19:25.631529 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/1654467.pem -> /usr/share/ca-certificates/1654467.pem
	I1124 09:19:25.631542 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/ssl/certs/16544672.pem -> /usr/share/ca-certificates/16544672.pem
	I1124 09:19:25.632117 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1124 09:19:25.653566 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I1124 09:19:25.672677 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1124 09:19:25.692448 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I1124 09:19:25.712758 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I1124 09:19:25.730246 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I1124 09:19:25.748136 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1124 09:19:25.765102 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I1124 09:19:25.782676 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1124 09:19:25.800418 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/1654467.pem --> /usr/share/ca-certificates/1654467.pem (1338 bytes)
	I1124 09:19:25.818179 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/ssl/certs/16544672.pem --> /usr/share/ca-certificates/16544672.pem (1708 bytes)
	I1124 09:19:25.836420 1701291 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I1124 09:19:25.849273 1701291 ssh_runner.go:195] Run: openssl version
	I1124 09:19:25.855675 1701291 command_runner.go:130] > OpenSSL 3.0.17 1 Jul 2025 (Library: OpenSSL 3.0.17 1 Jul 2025)
	I1124 09:19:25.855803 1701291 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I1124 09:19:25.864243 1701291 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1124 09:19:25.867919 1701291 command_runner.go:130] > -rw-r--r-- 1 root root 1111 Nov 24 08:44 /usr/share/ca-certificates/minikubeCA.pem
	I1124 09:19:25.867982 1701291 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Nov 24 08:44 /usr/share/ca-certificates/minikubeCA.pem
	I1124 09:19:25.868042 1701291 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1124 09:19:25.908611 1701291 command_runner.go:130] > b5213941
	I1124 09:19:25.909123 1701291 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I1124 09:19:25.916880 1701291 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1654467.pem && ln -fs /usr/share/ca-certificates/1654467.pem /etc/ssl/certs/1654467.pem"
	I1124 09:19:25.925097 1701291 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1654467.pem
	I1124 09:19:25.928711 1701291 command_runner.go:130] > -rw-r--r-- 1 root root 1338 Nov 24 09:10 /usr/share/ca-certificates/1654467.pem
	I1124 09:19:25.928823 1701291 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Nov 24 09:10 /usr/share/ca-certificates/1654467.pem
	I1124 09:19:25.928900 1701291 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1654467.pem
	I1124 09:19:25.969833 1701291 command_runner.go:130] > 51391683
	I1124 09:19:25.970298 1701291 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1654467.pem /etc/ssl/certs/51391683.0"
	I1124 09:19:25.978202 1701291 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/16544672.pem && ln -fs /usr/share/ca-certificates/16544672.pem /etc/ssl/certs/16544672.pem"
	I1124 09:19:25.986297 1701291 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/16544672.pem
	I1124 09:19:25.989958 1701291 command_runner.go:130] > -rw-r--r-- 1 root root 1708 Nov 24 09:10 /usr/share/ca-certificates/16544672.pem
	I1124 09:19:25.990028 1701291 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Nov 24 09:10 /usr/share/ca-certificates/16544672.pem
	I1124 09:19:25.990094 1701291 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/16544672.pem
	I1124 09:19:26.030947 1701291 command_runner.go:130] > 3ec20f2e
	I1124 09:19:26.031428 1701291 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/16544672.pem /etc/ssl/certs/3ec20f2e.0"
	I1124 09:19:26.039972 1701291 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1124 09:19:26.043966 1701291 command_runner.go:130] >   File: /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1124 09:19:26.043995 1701291 command_runner.go:130] >   Size: 1176      	Blocks: 8          IO Block: 4096   regular file
	I1124 09:19:26.044001 1701291 command_runner.go:130] > Device: 259,1	Inode: 1320367     Links: 1
	I1124 09:19:26.044008 1701291 command_runner.go:130] > Access: (0644/-rw-r--r--)  Uid: (    0/    root)   Gid: (    0/    root)
	I1124 09:19:26.044023 1701291 command_runner.go:130] > Access: 2025-11-24 09:15:17.409446871 +0000
	I1124 09:19:26.044028 1701291 command_runner.go:130] > Modify: 2025-11-24 09:11:12.722825550 +0000
	I1124 09:19:26.044034 1701291 command_runner.go:130] > Change: 2025-11-24 09:11:12.722825550 +0000
	I1124 09:19:26.044039 1701291 command_runner.go:130] >  Birth: 2025-11-24 09:11:12.722825550 +0000
	I1124 09:19:26.044132 1701291 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I1124 09:19:26.086676 1701291 command_runner.go:130] > Certificate will not expire
	I1124 09:19:26.086876 1701291 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I1124 09:19:26.129915 1701291 command_runner.go:130] > Certificate will not expire
	I1124 09:19:26.130020 1701291 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I1124 09:19:26.173544 1701291 command_runner.go:130] > Certificate will not expire
	I1124 09:19:26.174084 1701291 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I1124 09:19:26.214370 1701291 command_runner.go:130] > Certificate will not expire
	I1124 09:19:26.214874 1701291 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I1124 09:19:26.257535 1701291 command_runner.go:130] > Certificate will not expire
	I1124 09:19:26.257999 1701291 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I1124 09:19:26.298467 1701291 command_runner.go:130] > Certificate will not expire
	I1124 09:19:26.298937 1701291 kubeadm.go:401] StartCluster: {Name:functional-291288 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-291288 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1124 09:19:26.299045 1701291 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I1124 09:19:26.299146 1701291 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1124 09:19:26.324900 1701291 cri.go:89] found id: ""
	I1124 09:19:26.325047 1701291 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1124 09:19:26.331898 1701291 command_runner.go:130] > /var/lib/kubelet/config.yaml
	I1124 09:19:26.331976 1701291 command_runner.go:130] > /var/lib/kubelet/kubeadm-flags.env
	I1124 09:19:26.331999 1701291 command_runner.go:130] > /var/lib/minikube/etcd:
	I1124 09:19:26.332730 1701291 kubeadm.go:417] found existing configuration files, will attempt cluster restart
	I1124 09:19:26.332771 1701291 kubeadm.go:598] restartPrimaryControlPlane start ...
	I1124 09:19:26.332851 1701291 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I1124 09:19:26.340023 1701291 kubeadm.go:131] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I1124 09:19:26.340455 1701291 kubeconfig.go:47] verify endpoint returned: get endpoint: "functional-291288" does not appear in /home/jenkins/minikube-integration/21978-1652607/kubeconfig
	I1124 09:19:26.340556 1701291 kubeconfig.go:62] /home/jenkins/minikube-integration/21978-1652607/kubeconfig needs updating (will repair): [kubeconfig missing "functional-291288" cluster setting kubeconfig missing "functional-291288" context setting]
	I1124 09:19:26.340827 1701291 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21978-1652607/kubeconfig: {Name:mk02121ae6148bede61eabf0ed4e1826024715f4 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1124 09:19:26.341245 1701291 loader.go:402] Config loaded from file:  /home/jenkins/minikube-integration/21978-1652607/kubeconfig
	I1124 09:19:26.341410 1701291 kapi.go:59] client config for functional-291288: &rest.Config{Host:"https://192.168.49.2:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/client.crt", KeyFile:"/home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/client.key", CAFile:"/home/jenkins/minikube-integration/21978-1652607/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1fb2df0), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1124 09:19:26.341966 1701291 envvar.go:172] "Feature gate default state" feature="ClientsAllowCBOR" enabled=false
	I1124 09:19:26.341987 1701291 envvar.go:172] "Feature gate default state" feature="ClientsPreferCBOR" enabled=false
	I1124 09:19:26.341993 1701291 envvar.go:172] "Feature gate default state" feature="InformerResourceVersion" enabled=false
	I1124 09:19:26.341999 1701291 envvar.go:172] "Feature gate default state" feature="InOrderInformers" enabled=true
	I1124 09:19:26.342005 1701291 envvar.go:172] "Feature gate default state" feature="WatchListClient" enabled=false
	I1124 09:19:26.342302 1701291 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I1124 09:19:26.342404 1701291 cert_rotation.go:141] "Starting client certificate rotation controller" logger="tls-transport-cache"
	I1124 09:19:26.349720 1701291 kubeadm.go:635] The running cluster does not require reconfiguration: 192.168.49.2
	I1124 09:19:26.349757 1701291 kubeadm.go:602] duration metric: took 16.96677ms to restartPrimaryControlPlane
	I1124 09:19:26.349768 1701291 kubeadm.go:403] duration metric: took 50.840633ms to StartCluster
	I1124 09:19:26.349802 1701291 settings.go:142] acquiring lock: {Name:mk6c04793f5fd4f38f92abf4357247f2ccd7fc4e Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1124 09:19:26.349888 1701291 settings.go:150] Updating kubeconfig:  /home/jenkins/minikube-integration/21978-1652607/kubeconfig
	I1124 09:19:26.350548 1701291 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21978-1652607/kubeconfig: {Name:mk02121ae6148bede61eabf0ed4e1826024715f4 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1124 09:19:26.350757 1701291 start.go:236] Will wait 6m0s for node &{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I1124 09:19:26.351051 1701291 config.go:182] Loaded profile config "functional-291288": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1124 09:19:26.351103 1701291 addons.go:527] enable addons start: toEnable=map[ambassador:false amd-gpu-device-plugin:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:true efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubetail:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:true storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I1124 09:19:26.351171 1701291 addons.go:70] Setting storage-provisioner=true in profile "functional-291288"
	I1124 09:19:26.351184 1701291 addons.go:239] Setting addon storage-provisioner=true in "functional-291288"
	I1124 09:19:26.351210 1701291 host.go:66] Checking if "functional-291288" exists ...
	I1124 09:19:26.351260 1701291 addons.go:70] Setting default-storageclass=true in profile "functional-291288"
	I1124 09:19:26.351281 1701291 addons_storage_classes.go:34] enableOrDisableStorageClasses default-storageclass=true on "functional-291288"
	I1124 09:19:26.351591 1701291 cli_runner.go:164] Run: docker container inspect functional-291288 --format={{.State.Status}}
	I1124 09:19:26.351665 1701291 cli_runner.go:164] Run: docker container inspect functional-291288 --format={{.State.Status}}
	I1124 09:19:26.356026 1701291 out.go:179] * Verifying Kubernetes components...
	I1124 09:19:26.358753 1701291 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1124 09:19:26.386934 1701291 loader.go:402] Config loaded from file:  /home/jenkins/minikube-integration/21978-1652607/kubeconfig
	I1124 09:19:26.387124 1701291 kapi.go:59] client config for functional-291288: &rest.Config{Host:"https://192.168.49.2:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/client.crt", KeyFile:"/home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/client.key", CAFile:"/home/jenkins/minikube-integration/21978-1652607/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1fb2df0), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1124 09:19:26.387397 1701291 addons.go:239] Setting addon default-storageclass=true in "functional-291288"
	I1124 09:19:26.387423 1701291 host.go:66] Checking if "functional-291288" exists ...
	I1124 09:19:26.387832 1701291 cli_runner.go:164] Run: docker container inspect functional-291288 --format={{.State.Status}}
	I1124 09:19:26.389901 1701291 out.go:179]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	I1124 09:19:26.395008 1701291 addons.go:436] installing /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 09:19:26.395037 1701291 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
	I1124 09:19:26.395101 1701291 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:19:26.420232 1701291 addons.go:436] installing /etc/kubernetes/addons/storageclass.yaml
	I1124 09:19:26.420253 1701291 ssh_runner.go:362] scp storageclass/storageclass.yaml --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
	I1124 09:19:26.420313 1701291 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:19:26.425570 1701291 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34684 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-291288/id_rsa Username:docker}
	I1124 09:19:26.456516 1701291 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34684 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-291288/id_rsa Username:docker}
	I1124 09:19:26.560922 1701291 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1124 09:19:26.576856 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 09:19:26.613035 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
	I1124 09:19:27.382844 1701291 node_ready.go:35] waiting up to 6m0s for node "functional-291288" to be "Ready" ...
	I1124 09:19:27.383045 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:27.383222 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:27.383136 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:27.383333 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:27.383470 1701291 retry.go:31] will retry after 330.402351ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:27.383574 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:27.383622 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:27.383641 1701291 retry.go:31] will retry after 362.15201ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:27.383749 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:27.714181 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 09:19:27.746972 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1124 09:19:27.795758 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:27.795808 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:27.795853 1701291 retry.go:31] will retry after 486.739155ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:27.825835 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:27.825930 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:27.825968 1701291 retry.go:31] will retry after 300.110995ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:27.884058 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:27.884183 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:27.884499 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:28.126983 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1124 09:19:28.217006 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:28.217052 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:28.217072 1701291 retry.go:31] will retry after 300.765079ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:28.283248 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 09:19:28.347318 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:28.347417 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:28.347441 1701291 retry.go:31] will retry after 303.335388ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:28.383528 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:28.383642 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:28.383982 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:28.518292 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1124 09:19:28.580592 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:28.580640 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:28.580660 1701291 retry.go:31] will retry after 1.066338993s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:28.651903 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 09:19:28.713844 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:28.713897 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:28.713918 1701291 retry.go:31] will retry after 1.056665241s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:28.884118 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:28.884220 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:28.884569 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:29.383298 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:29.383424 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:29.383770 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:19:29.383848 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:19:29.647985 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1124 09:19:29.716805 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:29.720169 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:29.720200 1701291 retry.go:31] will retry after 944.131514ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:29.771443 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 09:19:29.838798 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:29.842880 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:29.842911 1701291 retry.go:31] will retry after 1.275018698s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:29.883132 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:29.883209 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:29.883509 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:30.383649 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:30.383776 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:30.384127 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:30.664505 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1124 09:19:30.720036 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:30.723467 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:30.723535 1701291 retry.go:31] will retry after 2.138623105s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:30.883817 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:30.883887 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:30.884224 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:31.118957 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 09:19:31.199799 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:31.199840 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:31.199882 1701291 retry.go:31] will retry after 2.182241097s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:31.383252 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:31.383376 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:31.383741 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:31.883141 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:31.883218 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:31.883484 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:19:31.883535 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:19:32.383203 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:32.383282 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:32.383615 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:32.863283 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1124 09:19:32.883678 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:32.883784 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:32.884128 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:32.923038 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:32.923079 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:32.923098 1701291 retry.go:31] will retry after 3.572603171s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:33.382308 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 09:19:33.383761 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:33.383826 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:33.384119 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:33.453074 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:33.453119 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:33.453141 1701291 retry.go:31] will retry after 3.109489242s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:33.883699 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:33.883773 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:33.884102 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:19:33.884157 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:19:34.383924 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:34.383999 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:34.384345 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:34.883591 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:34.883679 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:34.883980 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:35.383814 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:35.383894 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:35.384241 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:35.884036 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:35.884171 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:35.884537 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:19:35.884594 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:19:36.383696 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:36.383766 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:36.384025 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:36.496437 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1124 09:19:36.551663 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:36.555562 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:36.555638 1701291 retry.go:31] will retry after 5.073494199s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:36.562783 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 09:19:36.628271 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:36.628317 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:36.628342 1701291 retry.go:31] will retry after 5.770336946s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:36.883845 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:36.883918 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:36.884243 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:37.384077 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:37.384153 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:37.384472 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:37.883154 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:37.883226 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:37.883536 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:38.383157 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:38.383232 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:38.383563 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:19:38.383620 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:19:38.883187 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:38.883316 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:38.883616 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:39.383889 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:39.383969 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:39.384246 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:39.884093 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:39.884182 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:39.884521 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:40.383195 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:40.383272 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:40.383608 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:19:40.383663 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:19:40.883068 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:40.883144 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:40.883421 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:41.383174 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:41.383248 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:41.383670 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:41.630088 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1124 09:19:41.704671 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:41.704728 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:41.704747 1701291 retry.go:31] will retry after 8.448093656s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:41.884076 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:41.884161 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:41.884479 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:42.383803 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:42.383879 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:42.384141 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:19:42.384184 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:19:42.399541 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 09:19:42.476011 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:42.476071 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:42.476093 1701291 retry.go:31] will retry after 9.502945959s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:42.883588 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:42.883671 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:42.884026 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:43.383828 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:43.383907 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:43.384181 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:43.883670 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:43.883743 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:43.884060 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:44.383696 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:44.383771 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:44.384127 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:19:44.384222 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:19:44.883976 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:44.884053 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:44.884413 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:45.383648 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:45.383811 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:45.384197 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:45.883998 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:45.884089 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:45.884467 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:46.383603 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:46.383678 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:46.384022 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:46.883619 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:46.883699 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:46.883981 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:19:46.884038 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:19:47.383777 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:47.383855 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:47.384200 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:47.883911 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:47.884016 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:47.884384 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:48.383668 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:48.383739 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:48.384087 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:48.883874 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:48.883952 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:48.884283 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:19:48.884343 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:19:49.383082 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:49.383173 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:49.383540 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:49.883082 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:49.883151 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:49.883411 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:50.153986 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1124 09:19:50.216789 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:50.216837 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:50.216857 1701291 retry.go:31] will retry after 12.027560843s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:50.383117 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:50.383226 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:50.383583 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:50.883287 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:50.883368 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:50.883726 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:51.383622 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:51.383710 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:51.384038 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:19:51.384100 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:19:51.883690 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:51.883770 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:51.884105 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:51.979351 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 09:19:52.048232 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:52.048287 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:52.048307 1701291 retry.go:31] will retry after 5.922680138s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:52.383846 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:52.383926 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:52.384262 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:52.883642 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:52.883714 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:52.884029 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:53.383844 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:53.383917 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:53.384249 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:19:53.384309 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:19:53.884029 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:53.884108 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:53.884493 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:54.383680 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:54.383755 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:54.384008 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:54.883852 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:54.883926 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:54.884262 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:55.384060 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:55.384132 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:55.384467 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:19:55.384528 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:19:55.883800 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:55.883874 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:55.884153 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:56.383266 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:56.383344 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:56.383682 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:56.883176 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:56.883258 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:56.883607 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:57.383853 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:57.383936 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:57.384284 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:57.884078 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:57.884157 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:57.884542 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:19:57.884608 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:19:57.972042 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 09:19:58.032393 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:58.036131 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:58.036169 1701291 retry.go:31] will retry after 15.323516146s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:58.383700 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:58.383776 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:58.384074 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:58.883637 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:58.883711 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:58.883992 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:59.383767 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:59.383847 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:59.384170 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:59.883954 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:59.884029 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:59.884364 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:00.386702 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:00.386929 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:00.387350 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:00.387652 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:00.883998 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:00.884089 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:00.884461 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:01.383250 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:01.383328 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:01.383704 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:01.883996 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:01.884068 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:01.884357 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:02.244687 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1124 09:20:02.303604 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:20:02.306952 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:20:02.306992 1701291 retry.go:31] will retry after 20.630907774s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:20:02.383202 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:02.383281 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:02.383599 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:02.883330 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:02.883410 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:02.883745 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:02.883800 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:03.383311 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:03.383386 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:03.383651 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:03.883196 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:03.883275 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:03.883594 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:04.383175 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:04.383259 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:04.383568 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:04.883120 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:04.883192 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:04.883478 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:05.383217 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:05.383295 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:05.383624 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:05.383680 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:05.883368 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:05.883446 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:05.883773 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:06.383723 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:06.383806 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:06.384068 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:06.883869 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:06.883945 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:06.884264 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:07.384063 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:07.384138 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:07.384462 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:07.384526 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:07.883109 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:07.883188 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:07.883446 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:08.383152 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:08.383224 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:08.383566 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:08.883199 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:08.883279 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:08.883603 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:09.383126 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:09.383212 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:09.383470 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:09.883162 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:09.883254 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:09.883549 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:09.883599 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:10.383190 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:10.383264 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:10.383586 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:10.883817 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:10.883892 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:10.884145 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:11.383273 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:11.383344 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:11.383622 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:11.883313 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:11.883389 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:11.883749 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:11.883806 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:12.383448 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:12.383520 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:12.383791 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:12.883177 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:12.883257 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:12.883572 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:13.360275 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 09:20:13.383805 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:13.383886 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:13.384154 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:13.423794 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:20:13.423847 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:20:13.423866 1701291 retry.go:31] will retry after 19.725114159s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:20:13.884034 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:13.884124 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:13.884430 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:13.884481 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:14.383158 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:14.383258 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:14.383624 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:14.883202 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:14.883284 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:14.883644 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:15.383356 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:15.383435 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:15.383734 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:15.883472 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:15.883549 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:15.883909 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:16.384044 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:16.384118 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:16.384464 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:16.384554 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:16.883205 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:16.883292 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:16.883609 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:17.383212 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:17.383289 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:17.383587 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:17.883207 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:17.883289 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:17.883594 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:18.383758 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:18.383840 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:18.384110 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:18.883984 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:18.884085 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:18.884539 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:18.884620 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:19.383195 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:19.383308 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:19.383679 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:19.883124 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:19.883204 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:19.883474 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:20.383183 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:20.383264 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:20.383612 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:20.883327 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:20.883410 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:20.883750 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:21.383113 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:21.383189 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:21.383447 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:21.383491 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:21.883186 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:21.883258 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:21.883619 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:22.383192 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:22.383277 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:22.383650 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:22.883350 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:22.883422 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:22.883692 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:22.939045 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1124 09:20:23.002892 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:20:23.002941 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:20:23.002963 1701291 retry.go:31] will retry after 24.365576381s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:20:23.384046 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:23.384125 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:23.384460 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:23.384522 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:23.883216 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:23.883293 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:23.883634 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:24.383833 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:24.383929 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:24.384212 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:24.884088 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:24.884168 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:24.884519 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:25.383227 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:25.383307 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:25.383654 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:25.883912 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:25.883982 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:25.884337 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:25.884396 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:26.383528 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:26.383619 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:26.383952 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:26.883735 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:26.883810 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:26.884149 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:27.383645 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:27.383725 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:27.384079 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:27.883693 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:27.883792 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:27.884080 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:28.383869 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:28.383941 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:28.384276 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:28.384333 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:28.883621 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:28.883696 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:28.884021 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:29.383693 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:29.383768 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:29.384125 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:29.883838 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:29.883920 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:29.884279 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:30.383628 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:30.383705 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:30.383961 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:30.883414 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:30.883492 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:30.883837 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:30.883893 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:31.383689 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:31.383767 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:31.384087 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:31.883629 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:31.883699 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:31.883964 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:32.383819 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:32.383897 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:32.384254 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:32.884067 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:32.884145 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:32.884453 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:32.884504 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:33.149949 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 09:20:33.204697 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:20:33.208037 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:20:33.208070 1701291 retry.go:31] will retry after 22.392696015s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:20:33.383469 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:33.383538 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:33.383796 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:33.883550 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:33.883634 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:33.883947 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:34.383737 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:34.383811 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:34.384171 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:34.883654 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:34.883734 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:34.884066 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:35.383856 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:35.383928 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:35.384271 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:35.384326 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:35.883926 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:35.884005 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:35.884370 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:36.383314 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:36.383384 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:36.383644 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:36.883149 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:36.883225 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:36.883565 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:37.383275 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:37.383359 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:37.383702 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:37.883387 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:37.883466 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:37.883722 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:37.883762 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:38.383178 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:38.383252 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:38.383603 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:38.883170 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:38.883244 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:38.883650 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:39.383205 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:39.383274 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:39.383534 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:39.883192 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:39.883267 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:39.883632 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:40.383376 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:40.383463 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:40.383839 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:40.383896 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:40.883093 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:40.883170 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:40.883479 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:41.383194 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:41.383276 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:41.383635 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:41.883337 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:41.883422 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:41.883716 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:42.383386 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:42.383461 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:42.383814 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:42.883190 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:42.883266 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:42.883601 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:42.883670 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:43.383217 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:43.383293 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:43.383671 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:43.883117 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:43.883198 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:43.883473 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:44.383174 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:44.383255 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:44.383558 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:44.883289 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:44.883363 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:44.883641 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:45.383065 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:45.383134 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:45.383415 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:45.383456 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
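The cycle above — a GET against `/api/v1/nodes/functional-291288` every 500ms, with a `will retry` warning while the apiserver refuses connections — is a plain poll-until-ready loop. A minimal sketch of that pattern (not minikube's actual `node_ready.go` code; the function names and timings here are illustrative) looks like this:

```python
import time

def poll_node_ready(check, interval=0.001, timeout=1.0):
    """Call check() on a fixed cadence until it reports ready or the
    timeout elapses. Mirrors the 500ms polling cadence in the log,
    scaled down so the sketch runs quickly."""
    deadline = time.monotonic() + timeout
    while True:
        if check():          # e.g. GET the node and inspect the Ready condition
            return True
        if time.monotonic() > deadline:
            return False     # caller surfaces the last "connection refused"
        time.sleep(interval)

calls = {"n": 0}

def refused_then_ready():
    # Simulated apiserver: refuse the first three probes, then come up.
    calls["n"] += 1
    return calls["n"] >= 4

print(poll_node_ready(refused_then_ready), calls["n"])  # → True 4
```

The key property, visible in the log, is that a refused connection is not fatal: each failure is logged at warning level and the loop simply waits for the next tick until the overall deadline is hit.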
	I1124 09:20:45.883191 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:45.883274 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:45.883563 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:46.383411 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:46.383487 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:46.383849 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:46.883402 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:46.883490 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:46.883752 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:47.369539 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1124 09:20:47.383080 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:47.383149 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:47.383440 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:47.383498 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:47.426348 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:20:47.429646 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:20:47.429686 1701291 retry.go:31] will retry after 22.399494886s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:20:47.883262 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:47.883365 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:47.883699 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:48.383121 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:48.383192 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:48.383450 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:48.883172 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:48.883246 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:48.883565 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:49.383175 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:49.383254 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:49.383619 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:49.383673 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:49.883307 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:49.883381 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:49.883648 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:50.383167 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:50.383242 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:50.383602 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:50.883305 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:50.883403 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:50.883701 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:51.383597 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:51.383671 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:51.383949 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:51.383999 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:51.883799 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:51.883891 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:51.884215 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:52.383953 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:52.384046 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:52.384337 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:52.883622 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:52.883695 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:52.883974 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:53.383750 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:53.383840 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:53.384189 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:53.384246 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:53.883872 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:53.883946 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:53.884278 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:54.383691 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:54.383768 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:54.384062 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:54.883870 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:54.883951 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:54.884279 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:55.384078 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:55.384159 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:55.384531 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:55.384594 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:55.601942 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 09:20:55.661064 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:20:55.665031 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:20:55.665156 1701291 out.go:285] ! Enabling 'storage-provisioner' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
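The addon path behaves differently from the readiness poll: when `kubectl apply` exits non-zero, `retry.go` schedules a single delayed re-attempt with a non-round delay (`will retry after 22.399494886s` above), which is characteristic of a jittered backoff. A plausible sketch of how such delays could be generated is below; the base, factor, and jitter values are assumptions for illustration, not minikube's actual parameters:

```python
import random

def retry_delays(base=10.0, factor=2.0, jitter=0.5, attempts=4):
    """Yield successive retry delays: each attempt's delay grows by
    `factor` and is scaled by a random jitter in [1-jitter, 1+jitter],
    producing non-round values like the 22.399s seen in the log."""
    delay = base
    for _ in range(attempts):
        yield delay * random.uniform(1.0 - jitter, 1.0 + jitter)
        delay *= factor

for i, d in enumerate(retry_delays(attempts=3), start=1):
    print(f"attempt {i}: will retry after {d:.9f}s")
```

Jitter matters here because several addon appliers (storageclass, storage-provisioner) fail at the same moment the apiserver goes down; randomized delays spread their retries out instead of having them all hammer the apiserver at once when it returns.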
	I1124 09:20:55.883471 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:55.883549 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:55.883839 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:56.384006 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:56.384085 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:56.384438 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:56.883159 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:56.883256 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:56.883546 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:57.383058 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:57.383127 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:57.383401 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:57.883132 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:57.883210 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:57.883522 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:57.883572 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:58.383147 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:58.383243 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:58.383538 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:58.883654 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:58.883729 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:58.883987 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:59.383805 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:59.383888 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:59.384179 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:59.883985 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:59.884058 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:59.884355 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:59.884403 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:00.383754 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:00.383833 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:00.384151 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:00.883935 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:00.884016 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:00.884352 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:01.383267 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:01.383344 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:01.383652 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:01.883147 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:01.883222 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:01.883556 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:02.383255 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:02.383332 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:02.383663 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:02.383721 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:02.883448 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:02.883530 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:02.883895 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:03.383623 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:03.383692 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:03.383959 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:03.883727 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:03.883833 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:03.884183 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:04.383989 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:04.384068 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:04.384431 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:04.384491 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:04.883658 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:04.883737 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:04.884051 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:05.383792 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:05.383863 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:05.384221 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:05.883873 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:05.883951 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:05.884288 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:06.383270 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:06.383343 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:06.383618 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:06.883169 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:06.883268 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:06.883573 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:06.883620 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:07.383342 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:07.383427 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:07.383765 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:07.884027 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:07.884094 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:07.884425 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:08.383164 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:08.383239 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:08.383598 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:08.883308 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:08.883398 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:08.883741 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:08.883802 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:09.383098 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:09.383166 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:09.383423 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:09.830147 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1124 09:21:09.883707 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:09.883815 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:09.884234 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:09.887265 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:21:09.890761 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:21:09.890861 1701291 out.go:285] ! Enabling 'default-storageclass' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	I1124 09:21:09.895836 1701291 out.go:179] * Enabled addons: 
	I1124 09:21:09.899594 1701291 addons.go:530] duration metric: took 1m43.548488453s for enable addons: enabled=[]
	I1124 09:21:10.383381 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:10.383468 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:10.383851 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:10.883541 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:10.883612 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:10.883871 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:10.883921 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:11.383721 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:11.383804 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:11.384146 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:11.883758 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:11.883832 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:11.884153 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:12.383650 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:12.383725 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:12.383994 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:12.883791 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:12.883869 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:12.884200 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:12.884259 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:13.384051 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:13.384130 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:13.384481 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:13.883069 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:13.883147 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:13.883443 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:14.383158 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:14.383256 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:14.383600 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:14.883308 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:14.883386 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:14.883743 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:15.383457 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:15.383524 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:15.383790 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:15.383833 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:15.883160 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:15.883235 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:15.883570 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:16.383347 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:16.383428 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:16.383759 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:16.883325 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:16.883399 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:16.883664 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:17.383191 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:17.383290 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:17.383661 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:17.883228 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:17.883306 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:17.883672 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:17.883730 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:18.383978 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:18.384061 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:18.384373 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:18.883112 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:18.883209 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:18.883544 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:19.383112 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:19.383198 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:19.383544 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:19.883660 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:19.883735 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:19.883994 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:19.884034 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:20.383840 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:20.383939 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:20.384276 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:20.884063 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:20.884139 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:20.884609 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:21.383122 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:21.383191 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:21.383474 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:21.883229 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:21.883311 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:21.883643 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:22.383182 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:22.383259 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:22.383610 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:22.383663 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:22.884007 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:22.884077 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:22.884343 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:23.384148 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:23.384238 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:23.384581 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:23.883132 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:23.883207 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:23.883554 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:24.383088 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:24.383159 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:24.383481 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:24.883179 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:24.883275 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:24.883610 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:24.883675 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:25.383186 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:25.383268 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:25.383608 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:25.883472 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:25.883586 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:25.884129 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:26.383146 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:26.383230 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:26.383577 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:26.883188 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:26.883299 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:26.883678 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:26.883758 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:27.384111 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:27.384181 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:27.384491 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:27.883092 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:27.883171 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:27.883515 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:28.383158 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:28.383240 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:28.383623 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:28.883309 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:28.883385 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:28.883717 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:29.383191 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:29.383266 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:29.383650 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:29.383702 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:29.883188 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:29.883275 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:29.883613 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:30.383126 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:30.383193 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:30.383490 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:30.883171 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:30.883254 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:30.883605 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:31.383171 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:31.383250 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:31.383601 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:31.883122 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:31.883191 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:31.883449 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:31.883489 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:32.383217 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:32.383291 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:32.383620 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:32.883183 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:32.883260 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:32.883629 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:33.383180 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:33.383262 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:33.383549 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:33.883190 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:33.883271 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:33.883638 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:33.883694 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:34.383256 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:34.383337 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:34.383680 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:34.883132 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:34.883214 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:34.883526 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:35.383172 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:35.383252 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:35.383615 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:35.883199 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:35.883282 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:35.883582 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:36.383281 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:36.383351 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:36.383609 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:36.383650 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:36.883304 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:36.883386 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:36.883706 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:37.383434 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:37.383512 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:37.383858 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:37.883555 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:37.883635 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:37.883920 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:38.383713 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:38.383800 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:38.384150 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:38.384211 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:38.884005 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:38.884085 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:38.884432 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:39.383117 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:39.383189 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:39.383470 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:39.883210 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:39.883320 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:39.883681 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:40.383192 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:40.383273 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:40.383648 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:40.883329 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:40.883413 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:40.883677 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:40.883719 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:41.383810 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:41.383891 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:41.384260 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:41.884110 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:41.884211 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:41.884610 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:42.383111 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:42.383184 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:42.383469 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:42.883146 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:42.883219 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:42.883556 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:43.383303 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:43.383390 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:43.383815 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:43.383880 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:43.884150 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:43.884225 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:43.884489 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:44.383187 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:44.383285 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:44.383631 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:44.883347 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:44.883424 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:44.883787 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:45.383143 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:45.383221 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:45.383485 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:45.883220 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:45.883291 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:45.883631 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:45.883683 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:46.383565 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:46.383643 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:46.384005 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:46.883681 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:46.883753 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:46.884095 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:47.383931 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:47.384032 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:47.384438 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:47.884098 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:47.884173 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:47.884475 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:47.884521 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:48.383141 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:48.383214 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:48.383504 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:48.883189 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:48.883295 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:48.883641 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:49.383237 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:49.383316 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:49.383651 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:49.883138 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:49.883214 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:49.883514 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:50.383163 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:50.383242 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:50.383592 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:50.383651 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:50.883194 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:50.883284 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:50.883599 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:51.383074 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:51.383155 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:51.383436 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:51.883141 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:51.883231 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:51.883582 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:52.383155 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:52.383242 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:52.383575 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:52.883252 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:52.883327 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:52.883595 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:52.883642 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:53.383321 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:53.383392 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:53.383737 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:53.883216 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:53.883293 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:53.883646 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:54.383339 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:54.383413 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:54.383688 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:54.883201 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:54.883274 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:54.883590 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:55.383288 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:55.383366 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:55.383704 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:55.383769 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:55.883435 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:55.883505 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:55.883816 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:56.384009 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:56.384088 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:56.384422 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:56.883137 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:56.883222 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:56.883558 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:57.383818 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:57.383897 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:57.384172 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:57.384212 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:57.883977 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:57.884053 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:57.884399 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:58.383153 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:58.383233 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:58.383556 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:58.883101 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:58.883177 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:58.883433 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:59.383157 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:59.383236 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:59.383565 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:59.883168 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:59.883258 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:59.883650 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:59.883705 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:00.383305 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:00.383386 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:00.383837 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:00.883166 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:00.883245 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:00.883577 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:01.383186 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:01.383267 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:01.383606 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:01.883853 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:01.883923 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:01.884206 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:01.884257 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:02.384016 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:02.384095 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:02.384455 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:02.884100 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:02.884181 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:02.884522 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:03.383131 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:03.383207 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:03.383521 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:03.883200 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:03.883287 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:03.883604 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:04.383190 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:04.383274 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:04.383643 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:04.383702 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:04.883207 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:04.883279 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:04.883551 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:05.383188 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:05.383272 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:05.383607 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:05.883177 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:05.883257 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:05.883627 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:06.383792 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:06.383879 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:06.384240 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:06.384291 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:06.884040 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:06.884120 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:06.884445 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:07.383154 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:07.383230 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:07.383566 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:07.883872 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:07.883944 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:07.884212 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:08.383967 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:08.384042 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:08.384363 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:08.384428 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:08.883105 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:08.883184 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:08.883520 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:09.383654 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:09.383727 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:09.384039 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:09.883710 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:09.883788 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:09.884141 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:10.383940 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:10.384022 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:10.384358 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:10.883643 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:10.883717 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:10.883979 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:10.884026 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:11.384043 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:11.384119 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:11.384476 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:11.884107 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:11.884182 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:11.884497 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:12.383080 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:12.383156 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:12.383420 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:12.883114 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:12.883197 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:12.883546 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:13.383148 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:13.383235 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:13.383567 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:13.383626 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:13.883128 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:13.883206 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:13.883519 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:14.383161 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:14.383237 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:14.383589 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:14.883306 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:14.883385 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:14.883735 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:15.384005 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:15.384082 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:15.384357 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:15.384407 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:15.883074 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:15.883147 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:15.883531 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:16.383351 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:16.383433 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:16.383810 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:16.883361 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:16.883437 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:16.883741 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:17.383154 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:17.383240 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:17.383580 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:17.883164 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:17.883241 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:17.883543 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:17.883590 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:18.383110 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:18.383195 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:18.383512 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:18.883210 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:18.883284 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:18.883632 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:19.383181 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:19.383265 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:19.383589 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:19.883083 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:19.883153 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:19.883418 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:20.383720 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:20.383806 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:20.384138 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:20.384189 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:20.883895 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:20.883977 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:20.884383 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:21.383110 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:21.383179 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:21.383449 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:21.883148 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:21.883222 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:21.883554 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:22.383181 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:22.383267 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:22.383745 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:22.883438 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:22.883512 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:22.883827 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:22.883878 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:23.383219 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:23.383300 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:23.383650 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:23.883352 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:23.883439 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:23.883739 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:24.383094 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:24.383172 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:24.383441 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:24.883170 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:24.883246 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:24.883573 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:25.383180 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:25.383258 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:25.383557 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:25.383602 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:25.883124 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:25.883200 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:25.883530 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:26.383418 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:26.383502 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:26.383820 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:26.883156 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:26.883232 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:26.883574 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:27.383253 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:27.383325 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:27.383640 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:27.383695 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:27.883229 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:27.883308 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:27.883663 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:28.383352 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:28.383428 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:28.383771 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:28.883152 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:28.883224 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:28.883533 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:29.383259 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:29.383346 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:29.383718 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:29.383781 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:29.883468 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:29.883551 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:29.883860 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:30.383098 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:30.383174 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:30.383431 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:30.883167 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:30.883258 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:30.883626 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:31.383185 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:31.383269 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:31.383600 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:31.883135 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:31.883200 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:31.883477 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:31.883524 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:32.383251 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:32.383334 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:32.383667 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:32.883186 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:32.883294 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:32.883590 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:33.383124 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:33.383196 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:33.383537 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:33.883238 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:33.883319 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:33.883668 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:33.883725 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:34.383411 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:34.383500 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:34.383842 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:34.883113 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:34.883201 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:34.883459 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:35.383178 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:35.383251 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:35.383573 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:35.883163 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:35.883245 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:35.883570 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:36.383743 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:36.383821 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:36.384077 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:36.384116 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:36.883871 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:36.883954 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:36.884285 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:37.384043 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:37.384116 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:37.384446 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:37.883126 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:37.883195 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:37.883464 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:38.383151 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:38.383233 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:38.383571 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:38.883272 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:38.883352 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:38.883655 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:38.883702 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:39.383349 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:39.383416 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:39.383686 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:39.883194 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:39.883268 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:39.883616 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:40.383355 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:40.383439 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:40.383825 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:40.883050 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:40.883119 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:40.883381 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:41.383178 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:41.383263 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:41.383602 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:41.383659 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:41.883334 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:41.883418 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:41.883737 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:42.383098 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:42.383164 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:42.383505 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:42.883181 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:42.883256 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:42.883607 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:43.383328 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:43.383407 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:43.383779 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:43.383848 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:43.883040 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:43.883108 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:43.883373 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:44.383062 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:44.383137 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:44.383488 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:44.883210 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:44.883294 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:44.883624 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:45.383177 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:45.383283 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:45.383549 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:45.883285 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:45.883371 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:45.883679 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:45.883730 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:46.383910 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:46.383988 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:46.384338 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:46.883634 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:46.883708 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:46.883972 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:47.383794 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:47.383890 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:47.384333 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:47.883084 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:47.883172 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:47.883512 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:48.383207 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:48.383278 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:48.383553 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:48.383599 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:48.883146 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:48.883219 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:48.883545 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:49.383190 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:49.383267 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:49.383618 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:49.883863 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:49.883935 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:49.884201 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:50.384017 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:50.384095 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:50.384461 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:50.384517 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:50.883189 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:50.883269 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:50.883636 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:51.383067 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:51.383134 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:51.383393 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:51.883095 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:51.883170 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:51.883486 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:52.383088 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:52.383168 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:52.383503 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:52.883649 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:52.883715 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:52.883972 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:52.884013 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:53.383510 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:53.383586 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:53.383942 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:53.883728 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:53.883810 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:53.884186 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:54.383720 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:54.383800 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:54.384075 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:54.883881 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:54.883959 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:54.884315 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:54.884374 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:55.383090 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:55.383174 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:55.383511 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:55.883189 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:55.883267 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:55.883536 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:56.383980 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:56.384072 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:56.384430 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:56.883181 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:56.883264 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:56.883592 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:57.383208 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:57.383283 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:57.383562 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:57.383632 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:57.883178 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:57.883262 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:57.883557 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:58.383270 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:58.383366 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:58.383681 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:58.883065 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:58.883142 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:58.883409 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:59.383139 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:59.383281 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:59.383599 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:59.883295 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:59.883368 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:59.883709 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:59.883767 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:00.383455 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:00.383535 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:00.383834 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:00.883728 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:00.883804 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:00.884143 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:01.383926 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:01.384011 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:01.384371 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:01.883658 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:01.883732 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:01.884049 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:01.884099 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:02.383854 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:02.383936 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:02.384276 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:02.883965 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:02.884044 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:02.884421 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:03.383112 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:03.383191 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:03.383460 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:03.883214 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:03.883310 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:03.883701 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:04.383439 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:04.383520 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:04.383845 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:04.383903 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:04.883132 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:04.883203 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:04.883548 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:05.383170 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:05.383248 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:05.383591 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:05.883308 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:05.883386 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:05.883685 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:06.383882 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:06.383959 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:06.384277 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:06.384350 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:06.884102 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:06.884178 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:06.884513 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:07.383085 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:07.383184 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:07.383543 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:07.883883 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:07.883956 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:07.884221 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:08.384050 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:08.384123 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:08.384452 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:08.384509 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:08.883183 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:08.883259 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:08.883584 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:09.383246 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:09.383318 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:09.383640 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:09.883219 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:09.883299 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:09.883648 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:10.383378 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:10.383458 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:10.383753 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:10.883317 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:10.883388 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:10.883680 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:10.883723 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:11.383702 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:11.383803 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:11.384131 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:11.883721 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:11.883799 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:11.884129 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:12.383663 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:12.383738 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:12.384067 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:12.883854 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:12.883940 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:12.884274 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:12.884334 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:13.384116 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:13.384195 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:13.384538 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:13.883793 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:13.883869 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:13.884135 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:14.383911 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:14.383994 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:14.384297 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:14.883958 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:14.884048 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:14.884401 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:14.884456 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:15.383622 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:15.383700 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:15.383974 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:15.883699 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:15.883778 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:15.884117 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:16.384146 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:16.384226 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:16.384578 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:16.883198 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:16.883271 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:16.883565 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:17.383190 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:17.383273 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:17.383627 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:17.383682 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:17.883355 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:17.883436 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:17.883756 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:18.383116 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:18.383185 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:18.383441 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:18.883189 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:18.883268 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:18.883596 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:19.383187 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:19.383269 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:19.383630 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:19.883313 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:19.883385 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:19.883674 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:19.883725 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:20.383174 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:20.383257 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:20.383614 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:20.883340 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:20.883415 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:20.883771 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:21.383646 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:21.383720 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:21.383985 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:21.883845 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:21.883928 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:21.884313 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:21.884368 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:22.383054 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:22.383138 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:22.383471 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:22.883086 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:22.883170 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:22.883470 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:23.383158 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:23.383233 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:23.383562 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:23.883247 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:23.883321 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:23.883637 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:24.383095 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:24.383165 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:24.383431 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:24.383471 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:24.883124 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:24.883205 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:24.883534 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:25.383173 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:25.383251 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:25.383575 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:25.883126 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:25.883196 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:25.883470 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:26.383072 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:26.383157 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:26.383507 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:26.383568 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:26.883510 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:26.883587 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:26.883957 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:27.383649 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:27.383724 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:27.384102 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:27.883942 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:27.884029 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:27.884418 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:28.383174 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:28.383254 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:28.383611 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:28.383665 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:28.883114 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:28.883191 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:28.883456 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:29.383167 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:29.383248 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:29.383597 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:29.883182 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:29.883258 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:29.883579 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:30.383122 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:30.383199 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:30.383527 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:30.883137 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:30.883215 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:30.883514 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:30.883561 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:31.383185 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:31.383260 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:31.383609 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:31.883900 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:31.883970 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:31.884278 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:32.384086 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:32.384160 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:32.384455 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:32.883182 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:32.883259 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:32.883600 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:32.883657 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:33.383111 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:33.383190 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:33.383455 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:33.883174 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:33.883260 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:33.883641 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:34.383362 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:34.383442 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:34.383802 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:34.883106 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:34.883183 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:34.883439 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:35.383143 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:35.383220 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:35.383551 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:35.383604 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:35.883151 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:35.883229 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:35.883562 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:36.383293 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:36.383366 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:36.383619 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:36.883173 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:36.883255 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:36.883580 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:37.383161 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:37.383237 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:37.383584 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:37.383635 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:37.883107 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:37.883182 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:37.883504 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:38.383163 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:38.383248 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:38.383593 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:38.883178 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:38.883257 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:38.883615 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:39.383868 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:39.383940 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:39.384210 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:39.384251 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:39.883999 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:39.884075 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:39.884422 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:40.383152 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:40.383231 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:40.383560 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:40.883278 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:40.883355 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:40.883619 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:41.383462 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:41.383549 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:41.383883 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:41.883473 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:41.883550 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:41.883893 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:41.883952 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:42.383654 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:42.383728 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:42.384013 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:42.883799 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:42.883875 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:42.884236 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:43.384072 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:43.384157 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:43.384486 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:43.883149 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:43.883222 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:43.883524 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:44.383233 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:44.383315 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:44.383652 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:44.383713 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:44.883155 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:44.883243 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:44.883579 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:45.383126 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:45.383203 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:45.383524 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:45.883188 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:45.883264 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:45.883628 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:46.383346 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:46.383429 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:46.383765 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:46.383819 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:46.883878 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:46.883951 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:46.884224 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:47.384060 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:47.384136 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:47.384469 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:47.883170 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:47.883249 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:47.883589 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:48.383128 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:48.383211 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:48.383474 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:48.883184 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:48.883264 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:48.883602 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:48.883663 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:49.383164 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:49.383241 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:49.383569 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:49.883290 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:49.883367 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:49.883671 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:50.383185 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:50.383268 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:50.383648 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:50.883431 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:50.883514 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:50.883850 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:50.883909 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:51.383652 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:51.383720 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:51.383978 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:51.883444 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:51.883523 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:51.883866 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:52.383586 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:52.383680 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:52.384026 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:52.883655 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:52.883728 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:52.884053 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:52.884105 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:53.383855 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:53.383945 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:53.384271 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:53.884101 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:53.884186 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:53.884529 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:54.383101 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:54.383176 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:54.383443 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:54.883148 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:54.883222 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:54.883575 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:55.383191 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:55.383270 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:55.383608 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:55.383664 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:55.883870 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:55.883946 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:55.884289 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:56.383256 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:56.383351 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:56.383747 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:56.883462 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:56.883538 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:56.883871 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:57.383563 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:57.383638 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:57.383899 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:57.383944 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:57.883683 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:57.883768 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:57.884147 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:58.383932 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:58.384008 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:58.384395 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:58.883091 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:58.883159 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:58.883412 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:59.383084 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:59.383166 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:59.383498 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:59.883178 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:59.883259 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:59.883595 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:59.883655 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:00.392124 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:00.392210 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:00.392556 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:00.883180 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:00.883282 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:00.883653 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:01.383190 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:01.383267 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:01.383567 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:01.883245 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:01.883313 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:01.883583 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:02.383189 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:02.383267 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:02.383605 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:02.383667 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:02.883233 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:02.883317 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:02.883620 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:03.383856 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:03.383927 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:03.384185 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:03.884056 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:03.884135 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:03.884494 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:04.383223 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:04.383311 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:04.383613 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:04.883276 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:04.883345 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:04.883599 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:04.883643 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:05.383163 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:05.383239 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:05.383541 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:05.883213 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:05.883295 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:05.883634 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:06.383304 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:06.383375 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:06.383679 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:06.883401 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:06.883483 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:06.883806 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:06.883865 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:07.383190 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:07.383271 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:07.383648 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:07.883325 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:07.883398 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:07.883710 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:08.383188 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:08.383266 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:08.383566 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:08.883267 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:08.883345 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:08.883690 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:09.384042 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:09.384118 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:09.384458 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:09.384510 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:09.883180 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:09.883253 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:09.883573 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:10.383180 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:10.383261 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:10.383583 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:10.883127 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:10.883204 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:10.883474 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:11.383167 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:11.383240 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:11.383552 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:11.883157 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:11.883234 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:11.883563 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:11.883618 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:12.383084 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:12.383153 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:12.383411 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:12.883181 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:12.883256 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:12.883591 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:13.383188 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:13.383265 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:13.383586 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:13.883123 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:13.883204 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:13.883485 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:14.383559 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:14.383638 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:14.383953 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:14.384012 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:14.883792 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:14.883868 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:14.884213 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:15.383592 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:15.383667 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:15.383925 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:15.883767 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:15.883843 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:15.884202 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:16.383348 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:16.383420 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:16.383758 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:16.883457 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:16.883538 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:16.883795 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:16.883837 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:17.383161 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:17.383238 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:17.383573 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:17.883186 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:17.883271 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:17.883611 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:18.383302 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:18.383377 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:18.383637 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:18.883191 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:18.883269 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:18.883626 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:19.383147 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:19.383224 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:19.383554 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:19.383616 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:19.883116 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:19.883185 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:19.883449 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:20.383135 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:20.383213 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:20.383531 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:20.883180 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:20.883260 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:20.883559 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:21.383124 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:21.383200 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:21.383460 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:21.883132 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:21.883213 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:21.883553 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:21.883608 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:22.383153 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:22.383241 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:22.383543 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:22.883110 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:22.883179 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:22.883439 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:23.383165 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:23.383246 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:23.383640 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:23.883372 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:23.883448 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:23.883789 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:23.883846 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:24.383112 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:24.383188 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:24.383507 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:24.883200 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:24.883275 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:24.883561 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:25.383261 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:25.383336 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:25.383674 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:25.883358 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:25.883437 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:25.883749 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:26.383290 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:26.383368 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:26.383724 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:26.383783 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:26.883478 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:26.883555 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:26.883888 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:27.383604 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:27.383677 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:27.383939 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:27.883757 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:27.883845 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:27.884167 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:28.383852 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:28.383928 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:28.384269 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:28.384325 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:28.883626 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:28.883692 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:28.883958 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:29.383717 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:29.383796 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:29.384139 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:29.883960 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:29.884036 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:29.884369 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:30.383625 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:30.383694 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:30.383980 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:30.883744 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:30.883816 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:30.884150 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:30.884205 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:31.383977 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:31.384060 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:31.384393 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:31.883624 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:31.883716 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:31.883977 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:32.383741 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:32.383814 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:32.384155 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:32.883968 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:32.884055 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:32.884386 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:32.884443 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:33.383735 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:33.383805 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:33.384072 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:33.883915 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:33.883991 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:33.884369 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:34.383120 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:34.383204 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:34.383566 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:34.883846 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:34.883924 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:34.884224 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:35.383982 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:35.384056 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:35.384427 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:35.384483 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:35.884112 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:35.884192 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:35.884530 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:36.383425 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:36.383499 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:36.383766 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:36.883482 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:36.883565 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:36.883947 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:37.383742 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:37.383819 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:37.384158 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:37.883707 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:37.883774 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:37.884034 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:37.884074 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:38.383850 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:38.383958 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:38.384324 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:38.883075 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:38.883152 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:38.883501 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:39.383112 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:39.383186 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:39.383448 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:39.883223 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:39.883319 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:39.883638 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:40.383373 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:40.383445 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:40.383734 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:40.383787 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:40.883050 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:40.883126 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:40.883428 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:41.383217 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:41.383293 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:41.383634 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:41.883211 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:41.883294 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:41.883578 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:42.383069 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:42.383136 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:42.383390 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:42.883177 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:42.883268 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:42.883564 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:42.883610 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:43.383316 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:43.383402 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:43.383752 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:43.884076 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:43.884150 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:43.884466 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:44.383187 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:44.383282 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:44.383645 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:44.883389 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:44.883464 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:44.883804 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:44.883864 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:45.383117 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:45.383195 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:45.383502 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:45.883172 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:45.883255 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:45.883601 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:46.383362 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:46.383444 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:46.383798 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:46.883132 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:46.883204 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:46.883525 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:47.383245 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:47.383343 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:47.383724 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:47.383787 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:47.883322 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:47.883396 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:47.883705 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:48.383414 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:48.383490 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:48.383778 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:48.883192 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:48.883270 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:48.883613 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:49.383457 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:49.383533 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:49.383864 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:49.383922 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:49.883061 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:49.883134 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:49.883396 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:50.383133 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:50.383215 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:50.383592 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:50.883345 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:50.883424 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:50.883767 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:51.383609 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:51.383687 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:51.383946 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:51.383994 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:51.883714 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:51.883789 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:51.884128 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:52.383943 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:52.384028 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:52.384399 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:52.883710 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:52.883786 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:52.884049 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:53.383826 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:53.383902 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:53.384299 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:53.384353 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:53.883075 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:53.883154 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:53.883549 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:54.383241 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:54.383316 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:54.383579 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:54.883201 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:54.883281 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:54.883627 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:55.383208 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:55.383284 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:55.383613 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:55.883300 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:55.883372 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:55.883626 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:55.883666 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:56.383898 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:56.383987 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:56.384342 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:56.883076 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:56.883152 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:56.883529 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:57.383840 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:57.383919 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:57.384396 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:57.883127 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:57.883220 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:57.883601 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:58.383185 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:58.383263 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:58.383601 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:58.383658 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:58.883103 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:58.883174 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:58.883430 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:59.383161 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:59.383239 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:59.383528 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:59.883235 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:59.883319 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:59.883655 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:00.383085 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:00.383175 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:00.383480 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:00.883287 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:00.883398 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:00.883768 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:25:00.883828 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:25:01.383727 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:01.383809 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:01.384138 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:01.883728 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:01.883797 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:01.884120 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:02.383919 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:02.383991 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:02.384291 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:02.884049 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:02.884120 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:02.884420 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:25:02.884485 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:25:03.383819 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:03.383888 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:03.384209 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:03.884004 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:03.884091 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:03.884451 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:04.384095 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:04.384179 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:04.384501 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:04.883234 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:04.883303 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:04.883584 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:05.383144 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:05.383220 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:05.383542 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:25:05.383601 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:25:05.883200 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:05.883285 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:05.883658 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:06.383258 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:06.383334 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:06.383660 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:06.883258 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:06.883336 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:06.883680 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:07.383404 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:07.383485 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:07.383858 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:25:07.383913 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:25:07.883619 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:07.883699 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:07.883964 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:08.383741 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:08.383818 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:08.384168 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:08.883989 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:08.884068 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:08.884393 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:09.384092 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:09.384163 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:09.384427 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:25:09.384467 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:25:09.883175 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:09.883251 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:09.883564 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:10.383182 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:10.383261 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:10.383593 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:10.883126 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:10.883201 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:10.883461 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:11.383179 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:11.383273 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:11.383596 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:11.883182 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:11.883257 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:11.883609 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:25:11.883665 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:25:12.383318 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:12.383399 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:12.383715 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:12.883177 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:12.883251 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:12.883592 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:13.383299 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:13.383377 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:13.383726 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:13.883393 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:13.883461 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:13.883721 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:25:13.883763 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:25:14.383180 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:14.383258 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:14.383605 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:14.883184 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:14.883272 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:14.883660 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:15.383351 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:15.383434 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:15.383700 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:15.883201 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:15.883305 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:15.883711 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:16.383360 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:16.383441 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:16.383809 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:25:16.383867 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:25:16.883068 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:16.883136 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:16.883406 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:17.383093 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:17.383175 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:17.383513 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:17.883239 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:17.883322 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:17.883695 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:18.383395 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:18.383464 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:18.383742 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:18.883187 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:18.883264 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:18.883645 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:25:18.883700 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:25:19.383197 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:19.383275 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:19.383625 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:19.883335 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:19.883406 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:19.883791 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:20.383200 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:20.383305 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:20.383704 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:20.883416 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:20.883493 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:20.883891 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:25:20.883947 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:25:21.383661 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:21.383731 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:21.383987 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:21.883737 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:21.883815 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:21.884385 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:22.383106 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:22.383198 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:22.383565 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:22.883149 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:22.883216 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:22.883512 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:23.383192 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:23.383276 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:23.383626 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:25:23.383680 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:25:23.883354 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:23.883454 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:23.883802 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:24.383131 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:24.383205 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:24.383521 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:24.883214 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:24.883298 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:24.883675 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:25.383244 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:25.383317 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:25.383636 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:25.883064 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:25.883143 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:25.883420 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:25:25.883474 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:25:26.383164 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:26.383254 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:26.383617 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:26.883333 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:26.883410 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:26.883740 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:27.383124 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:27.383182 1701291 node_ready.go:38] duration metric: took 6m0.000242478s for node "functional-291288" to be "Ready" ...
	I1124 09:25:27.386338 1701291 out.go:203] 
	W1124 09:25:27.389204 1701291 out.go:285] X Exiting due to GUEST_START: failed to start node: wait 6m0s for node: waiting for node to be ready: WaitNodeCondition: context deadline exceeded
	W1124 09:25:27.389224 1701291 out.go:285] * 
	W1124 09:25:27.391374 1701291 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1124 09:25:27.394404 1701291 out.go:203] 
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> containerd <==
	Nov 24 09:19:23 functional-291288 containerd[5880]: time="2025-11-24T09:19:23.924951340Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
	Nov 24 09:19:23 functional-291288 containerd[5880]: time="2025-11-24T09:19:23.924967094Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
	Nov 24 09:19:23 functional-291288 containerd[5880]: time="2025-11-24T09:19:23.925052084Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
	Nov 24 09:19:23 functional-291288 containerd[5880]: time="2025-11-24T09:19:23.925155182Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
	Nov 24 09:19:23 functional-291288 containerd[5880]: time="2025-11-24T09:19:23.925234518Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
	Nov 24 09:19:23 functional-291288 containerd[5880]: time="2025-11-24T09:19:23.925307536Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1
	Nov 24 09:19:23 functional-291288 containerd[5880]: time="2025-11-24T09:19:23.925372423Z" level=info msg="runtime interface created"
	Nov 24 09:19:23 functional-291288 containerd[5880]: time="2025-11-24T09:19:23.925429064Z" level=info msg="created NRI interface"
	Nov 24 09:19:23 functional-291288 containerd[5880]: time="2025-11-24T09:19:23.925491768Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
	Nov 24 09:19:23 functional-291288 containerd[5880]: time="2025-11-24T09:19:23.925597820Z" level=info msg="Connect containerd service"
	Nov 24 09:19:23 functional-291288 containerd[5880]: time="2025-11-24T09:19:23.926112073Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
	Nov 24 09:19:23 functional-291288 containerd[5880]: time="2025-11-24T09:19:23.927754707Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
	Nov 24 09:19:23 functional-291288 containerd[5880]: time="2025-11-24T09:19:23.937734275Z" level=info msg="Start subscribing containerd event"
	Nov 24 09:19:23 functional-291288 containerd[5880]: time="2025-11-24T09:19:23.937812454Z" level=info msg="Start recovering state"
	Nov 24 09:19:23 functional-291288 containerd[5880]: time="2025-11-24T09:19:23.938047057Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
	Nov 24 09:19:23 functional-291288 containerd[5880]: time="2025-11-24T09:19:23.938153774Z" level=info msg=serving... address=/run/containerd/containerd.sock
	Nov 24 09:19:23 functional-291288 containerd[5880]: time="2025-11-24T09:19:23.961906907Z" level=info msg="Start event monitor"
	Nov 24 09:19:23 functional-291288 containerd[5880]: time="2025-11-24T09:19:23.962097555Z" level=info msg="Start cni network conf syncer for default"
	Nov 24 09:19:23 functional-291288 containerd[5880]: time="2025-11-24T09:19:23.962159619Z" level=info msg="Start streaming server"
	Nov 24 09:19:23 functional-291288 containerd[5880]: time="2025-11-24T09:19:23.962227353Z" level=info msg="Registered namespace \"k8s.io\" with NRI"
	Nov 24 09:19:23 functional-291288 containerd[5880]: time="2025-11-24T09:19:23.962282123Z" level=info msg="runtime interface starting up..."
	Nov 24 09:19:23 functional-291288 containerd[5880]: time="2025-11-24T09:19:23.962355650Z" level=info msg="starting plugins..."
	Nov 24 09:19:23 functional-291288 containerd[5880]: time="2025-11-24T09:19:23.962422760Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
	Nov 24 09:19:23 functional-291288 systemd[1]: Started containerd.service - containerd container runtime.
	Nov 24 09:19:23 functional-291288 containerd[5880]: time="2025-11-24T09:19:23.964372736Z" level=info msg="containerd successfully booted in 0.061188s"
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:25:31.429132    9218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:25:31.429752    9218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:25:31.431474    9218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:25:31.431993    9218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:25:31.433600    9218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[Nov24 08:20] overlayfs: idmapped layers are currently not supported
	[Nov24 08:21] overlayfs: idmapped layers are currently not supported
	[Nov24 08:22] overlayfs: idmapped layers are currently not supported
	[Nov24 08:23] overlayfs: idmapped layers are currently not supported
	[Nov24 08:24] overlayfs: idmapped layers are currently not supported
	[ +19.566509] overlayfs: idmapped layers are currently not supported
	[Nov24 08:25] overlayfs: idmapped layers are currently not supported
	[ +35.934095] overlayfs: idmapped layers are currently not supported
	[Nov24 08:27] overlayfs: idmapped layers are currently not supported
	[Nov24 08:28] overlayfs: idmapped layers are currently not supported
	[Nov24 08:29] overlayfs: idmapped layers are currently not supported
	[Nov24 08:30] overlayfs: idmapped layers are currently not supported
	[Nov24 08:31] overlayfs: idmapped layers are currently not supported
	[ +44.505180] overlayfs: idmapped layers are currently not supported
	[Nov24 08:32] overlayfs: idmapped layers are currently not supported
	[Nov24 08:33] overlayfs: idmapped layers are currently not supported
	[Nov24 08:34] overlayfs: idmapped layers are currently not supported
	[ +21.137877] overlayfs: idmapped layers are currently not supported
	[ +28.724175] overlayfs: idmapped layers are currently not supported
	[Nov24 08:36] overlayfs: idmapped layers are currently not supported
	[Nov24 08:37] overlayfs: idmapped layers are currently not supported
	[Nov24 08:38] overlayfs: idmapped layers are currently not supported
	[  +4.063834] overlayfs: idmapped layers are currently not supported
	[Nov24 08:40] overlayfs: idmapped layers are currently not supported
	[Nov24 08:42] kauditd_printk_skb: 8 callbacks suppressed
	
	
	==> kernel <==
	 09:25:31 up  8:07,  0 user,  load average: 0.48, 0.29, 0.49
	Linux functional-291288 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Nov 24 09:25:28 functional-291288 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Nov 24 09:25:28 functional-291288 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 812.
	Nov 24 09:25:28 functional-291288 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:25:28 functional-291288 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:25:28 functional-291288 kubelet[9028]: E1124 09:25:28.944507    9028 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Nov 24 09:25:28 functional-291288 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Nov 24 09:25:28 functional-291288 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Nov 24 09:25:29 functional-291288 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 813.
	Nov 24 09:25:29 functional-291288 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:25:29 functional-291288 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:25:29 functional-291288 kubelet[9092]: E1124 09:25:29.700154    9092 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Nov 24 09:25:29 functional-291288 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Nov 24 09:25:29 functional-291288 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Nov 24 09:25:30 functional-291288 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 814.
	Nov 24 09:25:30 functional-291288 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:25:30 functional-291288 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:25:30 functional-291288 kubelet[9113]: E1124 09:25:30.439156    9113 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Nov 24 09:25:30 functional-291288 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Nov 24 09:25:30 functional-291288 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Nov 24 09:25:31 functional-291288 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 815.
	Nov 24 09:25:31 functional-291288 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:25:31 functional-291288 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:25:31 functional-291288 kubelet[9148]: E1124 09:25:31.190917    9148 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Nov 24 09:25:31 functional-291288 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Nov 24 09:25:31 functional-291288 systemd[1]: kubelet.service: Failed with result 'exit-code'.

-- /stdout --
helpers_test.go:262: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-291288 -n functional-291288
helpers_test.go:262: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-291288 -n functional-291288: exit status 2 (367.271211ms)

-- stdout --
	Stopped

-- /stdout --
helpers_test.go:262: status error: exit status 2 (may be ok)
helpers_test.go:264: "functional-291288" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/KubectlGetPods (2.22s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/MinikubeKubectlCmd (2.29s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/MinikubeKubectlCmd
functional_test.go:731: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 kubectl -- --context functional-291288 get pods
functional_test.go:731: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-291288 kubectl -- --context functional-291288 get pods: exit status 1 (105.176048ms)

** stderr **
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?

** /stderr **
functional_test.go:734: failed to get pods. args "out/minikube-linux-arm64 -p functional-291288 kubectl -- --context functional-291288 get pods": exit status 1
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:223: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/MinikubeKubectlCmd]: network settings <======
helpers_test.go:230: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:238: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/MinikubeKubectlCmd]: docker inspect <======
helpers_test.go:239: (dbg) Run:  docker inspect functional-291288
helpers_test.go:243: (dbg) docker inspect functional-291288:

-- stdout --
	[
	    {
	        "Id": "70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52",
	        "Created": "2025-11-24T09:10:51.896020191Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 1695240,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-11-24T09:10:51.968983407Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:572c983e466f1f784136812eef5cc59ac623db764bc7704d3676c4643993fd08",
	        "ResolvConfPath": "/var/lib/docker/containers/70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52/hostname",
	        "HostsPath": "/var/lib/docker/containers/70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52/hosts",
	        "LogPath": "/var/lib/docker/containers/70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52/70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52-json.log",
	        "Name": "/functional-291288",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "functional-291288:/var",
	                "/lib/modules:/lib/modules:ro"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-291288",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52",
	                "LowerDir": "/var/lib/docker/overlay2/3cc6f5f8c809d502c515552ad283ef0dee330beb830a15376b0447c77fbc81b7-init/diff:/var/lib/docker/overlay2/bf294786c7ade59cb761c7bdcc27045362c3779b00a569b685443f34ade17737/diff",
	                "MergedDir": "/var/lib/docker/overlay2/3cc6f5f8c809d502c515552ad283ef0dee330beb830a15376b0447c77fbc81b7/merged",
	                "UpperDir": "/var/lib/docker/overlay2/3cc6f5f8c809d502c515552ad283ef0dee330beb830a15376b0447c77fbc81b7/diff",
	                "WorkDir": "/var/lib/docker/overlay2/3cc6f5f8c809d502c515552ad283ef0dee330beb830a15376b0447c77fbc81b7/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "volume",
	                "Name": "functional-291288",
	                "Source": "/var/lib/docker/volumes/functional-291288/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            },
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-291288",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-291288",
	                "name.minikube.sigs.k8s.io": "functional-291288",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "09c1c2eef0dca6362dde63b4cbc372c0cfa3e4fd084b8745043d8b88925691bf",
	            "SandboxKey": "/var/run/docker/netns/09c1c2eef0dc",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34684"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34685"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34688"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34686"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34687"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-291288": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "7e:49:22:0b:f9:2c",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "1e8f91e8ad9f46b831bbb1b0589b0022d940ee9875e64a648dc80612f3ca93dc",
	                    "EndpointID": "5de5ca8ccb07584b21e6e4e30dba12e0233e8d28c3e48e705cddffe75263b337",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-291288",
	                        "70848be15fcc"
	                    ]
	                }
	            }
	        }
	    }
	]

-- /stdout --
helpers_test.go:247: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-291288 -n functional-291288
helpers_test.go:247: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-291288 -n functional-291288: exit status 2 (322.73616ms)

-- stdout --
	Running

-- /stdout --
helpers_test.go:247: status error: exit status 2 (may be ok)
helpers_test.go:252: <<< TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/MinikubeKubectlCmd FAILED: start of post-mortem logs <<<
helpers_test.go:253: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/MinikubeKubectlCmd]: minikube logs <======
helpers_test.go:255: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 logs -n 25
helpers_test.go:260: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/MinikubeKubectlCmd logs: 
-- stdout --
	
	==> Audit <==
	┌─────────┬─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│ COMMAND │                                                                          ARGS                                                                           │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├─────────┼─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ image   │ functional-941011 image ls --format short --alsologtostderr                                                                                             │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:07 UTC │ 24 Nov 25 09:07 UTC │
	│ image   │ functional-941011 image ls --format yaml --alsologtostderr                                                                                              │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:07 UTC │ 24 Nov 25 09:07 UTC │
	│ ssh     │ functional-941011 ssh pgrep buildkitd                                                                                                                   │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:07 UTC │                     │
	│ image   │ functional-941011 image build -t localhost/my-image:functional-941011 testdata/build --alsologtostderr                                                  │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:07 UTC │ 24 Nov 25 09:07 UTC │
	│ image   │ functional-941011 image ls                                                                                                                              │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:07 UTC │ 24 Nov 25 09:07 UTC │
	│ image   │ functional-941011 image ls --format json --alsologtostderr                                                                                              │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:07 UTC │ 24 Nov 25 09:07 UTC │
	│ image   │ functional-941011 image ls --format table --alsologtostderr                                                                                             │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:07 UTC │ 24 Nov 25 09:07 UTC │
	│ delete  │ -p functional-941011                                                                                                                                    │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:10 UTC │ 24 Nov 25 09:10 UTC │
	│ start   │ -p functional-291288 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-beta.0 │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:10 UTC │                     │
	│ start   │ -p functional-291288 --alsologtostderr -v=8                                                                                                             │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:19 UTC │                     │
	│ cache   │ functional-291288 cache add registry.k8s.io/pause:3.1                                                                                                   │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:25 UTC │ 24 Nov 25 09:25 UTC │
	│ cache   │ functional-291288 cache add registry.k8s.io/pause:3.3                                                                                                   │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:25 UTC │ 24 Nov 25 09:25 UTC │
	│ cache   │ functional-291288 cache add registry.k8s.io/pause:latest                                                                                                │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:25 UTC │ 24 Nov 25 09:25 UTC │
	│ cache   │ functional-291288 cache add minikube-local-cache-test:functional-291288                                                                                 │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:25 UTC │ 24 Nov 25 09:25 UTC │
	│ cache   │ functional-291288 cache delete minikube-local-cache-test:functional-291288                                                                              │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:25 UTC │ 24 Nov 25 09:25 UTC │
	│ cache   │ delete registry.k8s.io/pause:3.3                                                                                                                        │ minikube          │ jenkins │ v1.37.0 │ 24 Nov 25 09:25 UTC │ 24 Nov 25 09:25 UTC │
	│ cache   │ list                                                                                                                                                    │ minikube          │ jenkins │ v1.37.0 │ 24 Nov 25 09:25 UTC │ 24 Nov 25 09:25 UTC │
	│ ssh     │ functional-291288 ssh sudo crictl images                                                                                                                │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:25 UTC │ 24 Nov 25 09:25 UTC │
	│ ssh     │ functional-291288 ssh sudo crictl rmi registry.k8s.io/pause:latest                                                                                      │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:25 UTC │ 24 Nov 25 09:25 UTC │
	│ ssh     │ functional-291288 ssh sudo crictl inspecti registry.k8s.io/pause:latest                                                                                 │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:25 UTC │                     │
	│ cache   │ functional-291288 cache reload                                                                                                                          │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:25 UTC │ 24 Nov 25 09:25 UTC │
	│ ssh     │ functional-291288 ssh sudo crictl inspecti registry.k8s.io/pause:latest                                                                                 │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:25 UTC │ 24 Nov 25 09:25 UTC │
	│ cache   │ delete registry.k8s.io/pause:3.1                                                                                                                        │ minikube          │ jenkins │ v1.37.0 │ 24 Nov 25 09:25 UTC │ 24 Nov 25 09:25 UTC │
	│ cache   │ delete registry.k8s.io/pause:latest                                                                                                                     │ minikube          │ jenkins │ v1.37.0 │ 24 Nov 25 09:25 UTC │ 24 Nov 25 09:25 UTC │
	│ kubectl │ functional-291288 kubectl -- --context functional-291288 get pods                                                                                       │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:25 UTC │                     │
	└─────────┴─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/11/24 09:19:20
	Running on machine: ip-172-31-30-239
	Binary: Built with gc go1.25.3 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1124 09:19:20.929895 1701291 out.go:360] Setting OutFile to fd 1 ...
	I1124 09:19:20.930102 1701291 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1124 09:19:20.930128 1701291 out.go:374] Setting ErrFile to fd 2...
	I1124 09:19:20.930149 1701291 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1124 09:19:20.930488 1701291 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21978-1652607/.minikube/bin
	I1124 09:19:20.930883 1701291 out.go:368] Setting JSON to false
	I1124 09:19:20.931751 1701291 start.go:133] hostinfo: {"hostname":"ip-172-31-30-239","uptime":28890,"bootTime":1763947071,"procs":158,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"92f46a7d-c249-4c12-924a-77f64874c910"}
	I1124 09:19:20.931843 1701291 start.go:143] virtualization:  
	I1124 09:19:20.938521 1701291 out.go:179] * [functional-291288] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1124 09:19:20.941571 1701291 out.go:179]   - MINIKUBE_LOCATION=21978
	I1124 09:19:20.941660 1701291 notify.go:221] Checking for updates...
	I1124 09:19:20.947508 1701291 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1124 09:19:20.950282 1701291 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/21978-1652607/kubeconfig
	I1124 09:19:20.953189 1701291 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/21978-1652607/.minikube
	I1124 09:19:20.956068 1701291 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1124 09:19:20.958991 1701291 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1124 09:19:20.962273 1701291 config.go:182] Loaded profile config "functional-291288": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1124 09:19:20.962433 1701291 driver.go:422] Setting default libvirt URI to qemu:///system
	I1124 09:19:20.992476 1701291 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1124 09:19:20.992586 1701291 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1124 09:19:21.057666 1701291 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-11-24 09:19:21.047762616 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214831104 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1124 09:19:21.057787 1701291 docker.go:319] overlay module found
	I1124 09:19:21.060830 1701291 out.go:179] * Using the docker driver based on existing profile
	I1124 09:19:21.063549 1701291 start.go:309] selected driver: docker
	I1124 09:19:21.063567 1701291 start.go:927] validating driver "docker" against &{Name:functional-291288 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-291288 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1124 09:19:21.063661 1701291 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1124 09:19:21.063775 1701291 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1124 09:19:21.121254 1701291 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-11-24 09:19:21.111151392 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214831104 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1124 09:19:21.121789 1701291 cni.go:84] Creating CNI manager for ""
	I1124 09:19:21.121863 1701291 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1124 09:19:21.121942 1701291 start.go:353] cluster config:
	{Name:functional-291288 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-291288 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1124 09:19:21.125134 1701291 out.go:179] * Starting "functional-291288" primary control-plane node in "functional-291288" cluster
	I1124 09:19:21.127989 1701291 cache.go:134] Beginning downloading kic base image for docker with containerd
	I1124 09:19:21.131005 1701291 out.go:179] * Pulling base image v0.0.48-1763789673-21948 ...
	I1124 09:19:21.133917 1701291 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f in local docker daemon
	I1124 09:19:21.133914 1701291 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
	I1124 09:19:21.154192 1701291 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f in local docker daemon, skipping pull
	I1124 09:19:21.154216 1701291 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f exists in daemon, skipping load
	W1124 09:19:21.197477 1701291 preload.go:144] https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.35.0-beta.0/preloaded-images-k8s-v18-v1.35.0-beta.0-containerd-overlay2-arm64.tar.lz4 status code: 404
	W1124 09:19:21.391690 1701291 preload.go:144] https://github.com/kubernetes-sigs/minikube-preloads/releases/download/v18/preloaded-images-k8s-v18-v1.35.0-beta.0-containerd-overlay2-arm64.tar.lz4 status code: 404
	I1124 09:19:21.391947 1701291 profile.go:143] Saving config to /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/config.json ...
	I1124 09:19:21.392070 1701291 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:19:21.392253 1701291 cache.go:243] Successfully downloaded all kic artifacts
	I1124 09:19:21.392304 1701291 start.go:360] acquireMachinesLock for functional-291288: {Name:mk85384dc057570e1f34db593d357cea738652c4 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:19:21.392403 1701291 start.go:364] duration metric: took 38.802µs to acquireMachinesLock for "functional-291288"
	I1124 09:19:21.392443 1701291 start.go:96] Skipping create...Using existing machine configuration
	I1124 09:19:21.392463 1701291 fix.go:54] fixHost starting: 
	I1124 09:19:21.392780 1701291 cli_runner.go:164] Run: docker container inspect functional-291288 --format={{.State.Status}}
	I1124 09:19:21.413220 1701291 fix.go:112] recreateIfNeeded on functional-291288: state=Running err=<nil>
	W1124 09:19:21.413254 1701291 fix.go:138] unexpected machine state, will restart: <nil>
	I1124 09:19:21.416439 1701291 out.go:252] * Updating the running docker "functional-291288" container ...
	I1124 09:19:21.416481 1701291 machine.go:94] provisionDockerMachine start ...
	I1124 09:19:21.416565 1701291 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:19:21.444143 1701291 main.go:143] libmachine: Using SSH client type: native
	I1124 09:19:21.444471 1701291 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 34684 <nil> <nil>}
	I1124 09:19:21.444480 1701291 main.go:143] libmachine: About to run SSH command:
	hostname
	I1124 09:19:21.581815 1701291 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:19:21.598566 1701291 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-291288
	
	I1124 09:19:21.598592 1701291 ubuntu.go:182] provisioning hostname "functional-291288"
	I1124 09:19:21.598669 1701291 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:19:21.623443 1701291 main.go:143] libmachine: Using SSH client type: native
	I1124 09:19:21.623759 1701291 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 34684 <nil> <nil>}
	I1124 09:19:21.623771 1701291 main.go:143] libmachine: About to run SSH command:
	sudo hostname functional-291288 && echo "functional-291288" | sudo tee /etc/hostname
	I1124 09:19:21.758572 1701291 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:19:21.799121 1701291 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-291288
	
	I1124 09:19:21.799200 1701291 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:19:21.831127 1701291 main.go:143] libmachine: Using SSH client type: native
	I1124 09:19:21.831435 1701291 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 34684 <nil> <nil>}
	I1124 09:19:21.831451 1701291 main.go:143] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sfunctional-291288' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 functional-291288/g' /etc/hosts;
				else 
					echo '127.0.1.1 functional-291288' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1124 09:19:21.919264 1701291 cache.go:107] acquiring lock: {Name:mk22a10f0ce1f3295b61e7e76c455d0494a3e278 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:19:21.919300 1701291 cache.go:107] acquiring lock: {Name:mk1cf42e67442503a46c578224bd3cb68bf682d4 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:19:21.919365 1701291 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 exists
	I1124 09:19:21.919369 1701291 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 exists
	I1124 09:19:21.919375 1701291 cache.go:96] cache image "registry.k8s.io/kube-apiserver:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0" took 74.126µs
	I1124 09:19:21.919377 1701291 cache.go:96] cache image "gcr.io/k8s-minikube/storage-provisioner:v5" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5" took 130.274µs
	I1124 09:19:21.919383 1701291 cache.go:80] save to tar file registry.k8s.io/kube-apiserver:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 succeeded
	I1124 09:19:21.919385 1701291 cache.go:80] save to tar file gcr.io/k8s-minikube/storage-provisioner:v5 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 succeeded
	I1124 09:19:21.919395 1701291 cache.go:107] acquiring lock: {Name:mkfdc49c8e68aee34cee0c9d441ae8a4dca675c6 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:19:21.919407 1701291 cache.go:107] acquiring lock: {Name:mk85f1502dbb97830776608fb729eb3605e112e6 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:19:21.919449 1701291 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 exists
	I1124 09:19:21.919433 1701291 cache.go:107] acquiring lock: {Name:mkdbf38e05e2c47c1a7a906a2236e9e7020a94c5 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:19:21.919454 1701291 cache.go:96] cache image "registry.k8s.io/pause:3.10.1" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1" took 48.764µs
	I1124 09:19:21.919460 1701291 cache.go:80] save to tar file registry.k8s.io/pause:3.10.1 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 succeeded
	I1124 09:19:21.919471 1701291 cache.go:107] acquiring lock: {Name:mk46ce3b59d7e062b3dbc8a90fe5b4231f256471 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:19:21.919266 1701291 cache.go:107] acquiring lock: {Name:mk80fdbe7cdb5bc17c2a82b4ecfd00214559a435 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:19:21.919495 1701291 cache.go:107] acquiring lock: {Name:mk726502cb84c177b2e14fee88512325761511c5 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:19:21.919506 1701291 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 exists
	I1124 09:19:21.919511 1701291 cache.go:96] cache image "registry.k8s.io/kube-proxy:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0" took 262.074µs
	I1124 09:19:21.919517 1701291 cache.go:80] save to tar file registry.k8s.io/kube-proxy:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 succeeded
	I1124 09:19:21.919425 1701291 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 exists
	I1124 09:19:21.919525 1701291 cache.go:96] cache image "registry.k8s.io/kube-controller-manager:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0" took 132.661µs
	I1124 09:19:21.919532 1701291 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 exists
	I1124 09:19:21.919476 1701291 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 exists
	I1124 09:19:21.919540 1701291 cache.go:96] cache image "registry.k8s.io/coredns/coredns:v1.13.1" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1" took 48.796µs
	I1124 09:19:21.919547 1701291 cache.go:80] save to tar file registry.k8s.io/coredns/coredns:v1.13.1 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 succeeded
	I1124 09:19:21.919541 1701291 cache.go:96] cache image "registry.k8s.io/kube-scheduler:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0" took 109.4µs
	I1124 09:19:21.919553 1701291 cache.go:80] save to tar file registry.k8s.io/kube-scheduler:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 succeeded
	I1124 09:19:21.919533 1701291 cache.go:80] save to tar file registry.k8s.io/kube-controller-manager:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 succeeded
	I1124 09:19:21.919557 1701291 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.5.24-0 exists
	I1124 09:19:21.919563 1701291 cache.go:96] cache image "registry.k8s.io/etcd:3.5.24-0" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.5.24-0" took 93.482µs
	I1124 09:19:21.919568 1701291 cache.go:80] save to tar file registry.k8s.io/etcd:3.5.24-0 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.5.24-0 succeeded
	I1124 09:19:21.919582 1701291 cache.go:87] Successfully saved all images to host disk.
	I1124 09:19:21.982718 1701291 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	I1124 09:19:21.982799 1701291 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/21978-1652607/.minikube CaCertPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/21978-1652607/.minikube}
	I1124 09:19:21.982852 1701291 ubuntu.go:190] setting up certificates
	I1124 09:19:21.982880 1701291 provision.go:84] configureAuth start
	I1124 09:19:21.982954 1701291 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-291288
	I1124 09:19:22.001413 1701291 provision.go:143] copyHostCerts
	I1124 09:19:22.001464 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.pem
	I1124 09:19:22.001516 1701291 exec_runner.go:144] found /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.pem, removing ...
	I1124 09:19:22.001530 1701291 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.pem
	I1124 09:19:22.001614 1701291 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.pem (1078 bytes)
	I1124 09:19:22.001708 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cert.pem
	I1124 09:19:22.001726 1701291 exec_runner.go:144] found /home/jenkins/minikube-integration/21978-1652607/.minikube/cert.pem, removing ...
	I1124 09:19:22.001731 1701291 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21978-1652607/.minikube/cert.pem
	I1124 09:19:22.001757 1701291 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/21978-1652607/.minikube/cert.pem (1123 bytes)
	I1124 09:19:22.001795 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/21978-1652607/.minikube/key.pem
	I1124 09:19:22.001816 1701291 exec_runner.go:144] found /home/jenkins/minikube-integration/21978-1652607/.minikube/key.pem, removing ...
	I1124 09:19:22.001820 1701291 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21978-1652607/.minikube/key.pem
	I1124 09:19:22.001845 1701291 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/21978-1652607/.minikube/key.pem (1679 bytes)
	I1124 09:19:22.001893 1701291 provision.go:117] generating server cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca-key.pem org=jenkins.functional-291288 san=[127.0.0.1 192.168.49.2 functional-291288 localhost minikube]
	I1124 09:19:22.129571 1701291 provision.go:177] copyRemoteCerts
	I1124 09:19:22.129639 1701291 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1124 09:19:22.129681 1701291 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:19:22.147944 1701291 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34684 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-291288/id_rsa Username:docker}
	I1124 09:19:22.254207 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I1124 09:19:22.254271 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I1124 09:19:22.271706 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server.pem -> /etc/docker/server.pem
	I1124 09:19:22.271768 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server.pem --> /etc/docker/server.pem (1220 bytes)
	I1124 09:19:22.289262 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I1124 09:19:22.289325 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I1124 09:19:22.306621 1701291 provision.go:87] duration metric: took 323.706379ms to configureAuth
	I1124 09:19:22.306647 1701291 ubuntu.go:206] setting minikube options for container-runtime
	I1124 09:19:22.306839 1701291 config.go:182] Loaded profile config "functional-291288": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1124 09:19:22.306847 1701291 machine.go:97] duration metric: took 890.360502ms to provisionDockerMachine
	I1124 09:19:22.306855 1701291 start.go:293] postStartSetup for "functional-291288" (driver="docker")
	I1124 09:19:22.306866 1701291 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1124 09:19:22.306912 1701291 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1124 09:19:22.306953 1701291 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:19:22.324012 1701291 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34684 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-291288/id_rsa Username:docker}
	I1124 09:19:22.434427 1701291 ssh_runner.go:195] Run: cat /etc/os-release
	I1124 09:19:22.437860 1701291 command_runner.go:130] > PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	I1124 09:19:22.437881 1701291 command_runner.go:130] > NAME="Debian GNU/Linux"
	I1124 09:19:22.437886 1701291 command_runner.go:130] > VERSION_ID="12"
	I1124 09:19:22.437890 1701291 command_runner.go:130] > VERSION="12 (bookworm)"
	I1124 09:19:22.437898 1701291 command_runner.go:130] > VERSION_CODENAME=bookworm
	I1124 09:19:22.437901 1701291 command_runner.go:130] > ID=debian
	I1124 09:19:22.437906 1701291 command_runner.go:130] > HOME_URL="https://www.debian.org/"
	I1124 09:19:22.437910 1701291 command_runner.go:130] > SUPPORT_URL="https://www.debian.org/support"
	I1124 09:19:22.437917 1701291 command_runner.go:130] > BUG_REPORT_URL="https://bugs.debian.org/"
	I1124 09:19:22.437980 1701291 main.go:143] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1124 09:19:22.437995 1701291 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1124 09:19:22.438006 1701291 filesync.go:126] Scanning /home/jenkins/minikube-integration/21978-1652607/.minikube/addons for local assets ...
	I1124 09:19:22.438064 1701291 filesync.go:126] Scanning /home/jenkins/minikube-integration/21978-1652607/.minikube/files for local assets ...
	I1124 09:19:22.438143 1701291 filesync.go:149] local asset: /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/ssl/certs/16544672.pem -> 16544672.pem in /etc/ssl/certs
	I1124 09:19:22.438150 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/ssl/certs/16544672.pem -> /etc/ssl/certs/16544672.pem
	I1124 09:19:22.438232 1701291 filesync.go:149] local asset: /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/test/nested/copy/1654467/hosts -> hosts in /etc/test/nested/copy/1654467
	I1124 09:19:22.438236 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/test/nested/copy/1654467/hosts -> /etc/test/nested/copy/1654467/hosts
	I1124 09:19:22.438277 1701291 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs /etc/test/nested/copy/1654467
	I1124 09:19:22.446265 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/ssl/certs/16544672.pem --> /etc/ssl/certs/16544672.pem (1708 bytes)
	I1124 09:19:22.463769 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/test/nested/copy/1654467/hosts --> /etc/test/nested/copy/1654467/hosts (40 bytes)
	I1124 09:19:22.481365 1701291 start.go:296] duration metric: took 174.495413ms for postStartSetup
	I1124 09:19:22.481446 1701291 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1124 09:19:22.481495 1701291 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:19:22.498552 1701291 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34684 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-291288/id_rsa Username:docker}
	I1124 09:19:22.598952 1701291 command_runner.go:130] > 14%
	I1124 09:19:22.599551 1701291 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1124 09:19:22.604050 1701291 command_runner.go:130] > 168G
	I1124 09:19:22.604631 1701291 fix.go:56] duration metric: took 1.212164413s for fixHost
	I1124 09:19:22.604655 1701291 start.go:83] releasing machines lock for "functional-291288", held for 1.212220037s
	I1124 09:19:22.604753 1701291 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-291288
	I1124 09:19:22.621885 1701291 ssh_runner.go:195] Run: cat /version.json
	I1124 09:19:22.621944 1701291 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:19:22.622207 1701291 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1124 09:19:22.622270 1701291 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:19:22.640397 1701291 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34684 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-291288/id_rsa Username:docker}
	I1124 09:19:22.648463 1701291 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34684 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-291288/id_rsa Username:docker}
	I1124 09:19:22.746016 1701291 command_runner.go:130] > {"iso_version": "v1.37.0-1763503576-21924", "kicbase_version": "v0.0.48-1763789673-21948", "minikube_version": "v1.37.0", "commit": "2996c7ec74d570fa8ab37e6f4f8813150d0c7473"}
	I1124 09:19:22.746158 1701291 ssh_runner.go:195] Run: systemctl --version
	I1124 09:19:22.840219 1701291 command_runner.go:130] > <a href="https://github.com/kubernetes/registry.k8s.io">Temporary Redirect</a>.
	I1124 09:19:22.840264 1701291 command_runner.go:130] > systemd 252 (252.39-1~deb12u1)
	I1124 09:19:22.840285 1701291 command_runner.go:130] > +PAM +AUDIT +SELINUX +APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT +QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified
	I1124 09:19:22.840354 1701291 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I1124 09:19:22.844675 1701291 command_runner.go:130] ! stat: cannot statx '/etc/cni/net.d/*loopback.conf*': No such file or directory
	W1124 09:19:22.844725 1701291 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1124 09:19:22.844793 1701291 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1124 09:19:22.852461 1701291 cni.go:259] no active bridge cni configs found in "/etc/cni/net.d" - nothing to disable
	I1124 09:19:22.852484 1701291 start.go:496] detecting cgroup driver to use...
	I1124 09:19:22.852517 1701291 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1124 09:19:22.852584 1701291 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I1124 09:19:22.868240 1701291 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I1124 09:19:22.881367 1701291 docker.go:218] disabling cri-docker service (if available) ...
	I1124 09:19:22.881470 1701291 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1124 09:19:22.896889 1701291 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1124 09:19:22.910017 1701291 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1124 09:19:23.028071 1701291 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1124 09:19:23.171419 1701291 docker.go:234] disabling docker service ...
	I1124 09:19:23.171539 1701291 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1124 09:19:23.187505 1701291 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1124 09:19:23.201405 1701291 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1124 09:19:23.324426 1701291 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1124 09:19:23.445186 1701291 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1124 09:19:23.457903 1701291 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I1124 09:19:23.470553 1701291 command_runner.go:130] > runtime-endpoint: unix:///run/containerd/containerd.sock
	I1124 09:19:23.472034 1701291 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:19:23.623898 1701291 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10.1"|' /etc/containerd/config.toml"
	I1124 09:19:23.632988 1701291 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I1124 09:19:23.641976 1701291 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I1124 09:19:23.642063 1701291 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I1124 09:19:23.651244 1701291 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1124 09:19:23.660198 1701291 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I1124 09:19:23.668706 1701291 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1124 09:19:23.677261 1701291 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1124 09:19:23.685600 1701291 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I1124 09:19:23.694593 1701291 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I1124 09:19:23.703191 1701291 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I1124 09:19:23.712006 1701291 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1124 09:19:23.718640 1701291 command_runner.go:130] > net.bridge.bridge-nf-call-iptables = 1
	I1124 09:19:23.719691 1701291 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1124 09:19:23.727172 1701291 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1124 09:19:23.844539 1701291 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I1124 09:19:23.964625 1701291 start.go:543] Will wait 60s for socket path /run/containerd/containerd.sock
	I1124 09:19:23.964708 1701291 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I1124 09:19:23.969624 1701291 command_runner.go:130] >   File: /run/containerd/containerd.sock
	I1124 09:19:23.969648 1701291 command_runner.go:130] >   Size: 0         	Blocks: 0          IO Block: 4096   socket
	I1124 09:19:23.969655 1701291 command_runner.go:130] > Device: 0,72	Inode: 1619        Links: 1
	I1124 09:19:23.969671 1701291 command_runner.go:130] > Access: (0660/srw-rw----)  Uid: (    0/    root)   Gid: (    0/    root)
	I1124 09:19:23.969685 1701291 command_runner.go:130] > Access: 2025-11-24 09:19:23.931843190 +0000
	I1124 09:19:23.969693 1701291 command_runner.go:130] > Modify: 2025-11-24 09:19:23.931843190 +0000
	I1124 09:19:23.969699 1701291 command_runner.go:130] > Change: 2025-11-24 09:19:23.931843190 +0000
	I1124 09:19:23.969707 1701291 command_runner.go:130] >  Birth: -
	I1124 09:19:23.970283 1701291 start.go:564] Will wait 60s for crictl version
	I1124 09:19:23.970345 1701291 ssh_runner.go:195] Run: which crictl
	I1124 09:19:23.973724 1701291 command_runner.go:130] > /usr/local/bin/crictl
	I1124 09:19:23.974288 1701291 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1124 09:19:23.995301 1701291 command_runner.go:130] > Version:  0.1.0
	I1124 09:19:23.995587 1701291 command_runner.go:130] > RuntimeName:  containerd
	I1124 09:19:23.995841 1701291 command_runner.go:130] > RuntimeVersion:  v2.1.5
	I1124 09:19:23.996049 1701291 command_runner.go:130] > RuntimeApiVersion:  v1
	I1124 09:19:23.998158 1701291 start.go:580] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v2.1.5
	RuntimeApiVersion:  v1
	I1124 09:19:23.998238 1701291 ssh_runner.go:195] Run: containerd --version
	I1124 09:19:24.020107 1701291 command_runner.go:130] > containerd containerd.io v2.1.5 fcd43222d6b07379a4be9786bda52438f0dd16a1
	I1124 09:19:24.020449 1701291 ssh_runner.go:195] Run: containerd --version
	I1124 09:19:24.041776 1701291 command_runner.go:130] > containerd containerd.io v2.1.5 fcd43222d6b07379a4be9786bda52438f0dd16a1
	I1124 09:19:24.047417 1701291 out.go:179] * Preparing Kubernetes v1.35.0-beta.0 on containerd 2.1.5 ...
	I1124 09:19:24.050497 1701291 cli_runner.go:164] Run: docker network inspect functional-291288 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1124 09:19:24.067531 1701291 ssh_runner.go:195] Run: grep 192.168.49.1	host.minikube.internal$ /etc/hosts
	I1124 09:19:24.071507 1701291 command_runner.go:130] > 192.168.49.1	host.minikube.internal
	I1124 09:19:24.071622 1701291 kubeadm.go:884] updating cluster {Name:functional-291288 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-291288 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1124 09:19:24.071797 1701291 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:19:24.253230 1701291 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:19:24.402285 1701291 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:19:24.552419 1701291 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
	I1124 09:19:24.552515 1701291 ssh_runner.go:195] Run: sudo crictl images --output json
	I1124 09:19:24.577200 1701291 command_runner.go:130] > {
	I1124 09:19:24.577221 1701291 command_runner.go:130] >   "images":  [
	I1124 09:19:24.577226 1701291 command_runner.go:130] >     {
	I1124 09:19:24.577235 1701291 command_runner.go:130] >       "id":  "sha256:66749159455b3f08c8318fe0233122f54d0f5889f9c5fdfb73c3fd9d99895b51",
	I1124 09:19:24.577240 1701291 command_runner.go:130] >       "repoTags":  [
	I1124 09:19:24.577245 1701291 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner:v5"
	I1124 09:19:24.577248 1701291 command_runner.go:130] >       ],
	I1124 09:19:24.577252 1701291 command_runner.go:130] >       "repoDigests":  [],
	I1124 09:19:24.577256 1701291 command_runner.go:130] >       "size":  "8032639",
	I1124 09:19:24.577264 1701291 command_runner.go:130] >       "username":  "",
	I1124 09:19:24.577269 1701291 command_runner.go:130] >       "pinned":  false
	I1124 09:19:24.577272 1701291 command_runner.go:130] >     },
	I1124 09:19:24.577276 1701291 command_runner.go:130] >     {
	I1124 09:19:24.577283 1701291 command_runner.go:130] >       "id":  "sha256:e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf",
	I1124 09:19:24.577290 1701291 command_runner.go:130] >       "repoTags":  [
	I1124 09:19:24.577296 1701291 command_runner.go:130] >         "registry.k8s.io/coredns/coredns:v1.13.1"
	I1124 09:19:24.577299 1701291 command_runner.go:130] >       ],
	I1124 09:19:24.577308 1701291 command_runner.go:130] >       "repoDigests":  [],
	I1124 09:19:24.577330 1701291 command_runner.go:130] >       "size":  "21166088",
	I1124 09:19:24.577335 1701291 command_runner.go:130] >       "username":  "nonroot",
	I1124 09:19:24.577339 1701291 command_runner.go:130] >       "pinned":  false
	I1124 09:19:24.577349 1701291 command_runner.go:130] >     },
	I1124 09:19:24.577357 1701291 command_runner.go:130] >     {
	I1124 09:19:24.577364 1701291 command_runner.go:130] >       "id":  "sha256:1211402d28f5813ed906916bfcdd0a7404c2f9048ef5bb54387a6745bc410eca",
	I1124 09:19:24.577368 1701291 command_runner.go:130] >       "repoTags":  [
	I1124 09:19:24.577373 1701291 command_runner.go:130] >         "registry.k8s.io/etcd:3.5.24-0"
	I1124 09:19:24.577376 1701291 command_runner.go:130] >       ],
	I1124 09:19:24.577380 1701291 command_runner.go:130] >       "repoDigests":  [],
	I1124 09:19:24.577384 1701291 command_runner.go:130] >       "size":  "21880804",
	I1124 09:19:24.577391 1701291 command_runner.go:130] >       "uid":  {
	I1124 09:19:24.577395 1701291 command_runner.go:130] >         "value":  "0"
	I1124 09:19:24.577400 1701291 command_runner.go:130] >       },
	I1124 09:19:24.577404 1701291 command_runner.go:130] >       "username":  "",
	I1124 09:19:24.577408 1701291 command_runner.go:130] >       "pinned":  false
	I1124 09:19:24.577421 1701291 command_runner.go:130] >     },
	I1124 09:19:24.577424 1701291 command_runner.go:130] >     {
	I1124 09:19:24.577431 1701291 command_runner.go:130] >       "id":  "sha256:2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42",
	I1124 09:19:24.577434 1701291 command_runner.go:130] >       "repoTags":  [
	I1124 09:19:24.577443 1701291 command_runner.go:130] >         "registry.k8s.io/etcd:3.6.5-0"
	I1124 09:19:24.577450 1701291 command_runner.go:130] >       ],
	I1124 09:19:24.577454 1701291 command_runner.go:130] >       "repoDigests":  [
	I1124 09:19:24.577461 1701291 command_runner.go:130] >         "registry.k8s.io/etcd@sha256:042ef9c02799eb9303abf1aa99b09f09d94b8ee3ba0c2dd3f42dc4e1d3dce534"
	I1124 09:19:24.577465 1701291 command_runner.go:130] >       ],
	I1124 09:19:24.577469 1701291 command_runner.go:130] >       "size":  "21136588",
	I1124 09:19:24.577472 1701291 command_runner.go:130] >       "uid":  {
	I1124 09:19:24.577479 1701291 command_runner.go:130] >         "value":  "0"
	I1124 09:19:24.577482 1701291 command_runner.go:130] >       },
	I1124 09:19:24.577486 1701291 command_runner.go:130] >       "username":  "",
	I1124 09:19:24.577492 1701291 command_runner.go:130] >       "pinned":  false
	I1124 09:19:24.577495 1701291 command_runner.go:130] >     },
	I1124 09:19:24.577502 1701291 command_runner.go:130] >     {
	I1124 09:19:24.577512 1701291 command_runner.go:130] >       "id":  "sha256:ccd634d9bcc36ac6235e9c86676cd3a02c06afc3788a25f1bbf39ca7d44585f4",
	I1124 09:19:24.577516 1701291 command_runner.go:130] >       "repoTags":  [
	I1124 09:19:24.577521 1701291 command_runner.go:130] >         "registry.k8s.io/kube-apiserver:v1.35.0-beta.0"
	I1124 09:19:24.577527 1701291 command_runner.go:130] >       ],
	I1124 09:19:24.577531 1701291 command_runner.go:130] >       "repoDigests":  [],
	I1124 09:19:24.577535 1701291 command_runner.go:130] >       "size":  "24676285",
	I1124 09:19:24.577538 1701291 command_runner.go:130] >       "uid":  {
	I1124 09:19:24.577541 1701291 command_runner.go:130] >         "value":  "0"
	I1124 09:19:24.577545 1701291 command_runner.go:130] >       },
	I1124 09:19:24.577550 1701291 command_runner.go:130] >       "username":  "",
	I1124 09:19:24.577556 1701291 command_runner.go:130] >       "pinned":  false
	I1124 09:19:24.577560 1701291 command_runner.go:130] >     },
	I1124 09:19:24.577563 1701291 command_runner.go:130] >     {
	I1124 09:19:24.577569 1701291 command_runner.go:130] >       "id":  "sha256:68b5f775f18769fcb77bd8474c80bda2050163b6c66f4551f352b7381b8ca5be",
	I1124 09:19:24.577581 1701291 command_runner.go:130] >       "repoTags":  [
	I1124 09:19:24.577586 1701291 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager:v1.35.0-beta.0"
	I1124 09:19:24.577590 1701291 command_runner.go:130] >       ],
	I1124 09:19:24.577594 1701291 command_runner.go:130] >       "repoDigests":  [],
	I1124 09:19:24.577605 1701291 command_runner.go:130] >       "size":  "20658969",
	I1124 09:19:24.577608 1701291 command_runner.go:130] >       "uid":  {
	I1124 09:19:24.577612 1701291 command_runner.go:130] >         "value":  "0"
	I1124 09:19:24.577615 1701291 command_runner.go:130] >       },
	I1124 09:19:24.577619 1701291 command_runner.go:130] >       "username":  "",
	I1124 09:19:24.577624 1701291 command_runner.go:130] >       "pinned":  false
	I1124 09:19:24.577629 1701291 command_runner.go:130] >     },
	I1124 09:19:24.577633 1701291 command_runner.go:130] >     {
	I1124 09:19:24.577644 1701291 command_runner.go:130] >       "id":  "sha256:404c2e12861777b763b8feaa316d36680fc68ad308a8d2f6e55f1bb981cdd904",
	I1124 09:19:24.577655 1701291 command_runner.go:130] >       "repoTags":  [
	I1124 09:19:24.577660 1701291 command_runner.go:130] >         "registry.k8s.io/kube-proxy:v1.35.0-beta.0"
	I1124 09:19:24.577663 1701291 command_runner.go:130] >       ],
	I1124 09:19:24.577667 1701291 command_runner.go:130] >       "repoDigests":  [],
	I1124 09:19:24.577678 1701291 command_runner.go:130] >       "size":  "22428165",
	I1124 09:19:24.577686 1701291 command_runner.go:130] >       "username":  "",
	I1124 09:19:24.577692 1701291 command_runner.go:130] >       "pinned":  false
	I1124 09:19:24.577696 1701291 command_runner.go:130] >     },
	I1124 09:19:24.577706 1701291 command_runner.go:130] >     {
	I1124 09:19:24.577712 1701291 command_runner.go:130] >       "id":  "sha256:16378741539f1be9c6e347d127537d379a6592587b09b4eb47964cb5c43a409b",
	I1124 09:19:24.577716 1701291 command_runner.go:130] >       "repoTags":  [
	I1124 09:19:24.577721 1701291 command_runner.go:130] >         "registry.k8s.io/kube-scheduler:v1.35.0-beta.0"
	I1124 09:19:24.577724 1701291 command_runner.go:130] >       ],
	I1124 09:19:24.577728 1701291 command_runner.go:130] >       "repoDigests":  [],
	I1124 09:19:24.577738 1701291 command_runner.go:130] >       "size":  "15389290",
	I1124 09:19:24.577744 1701291 command_runner.go:130] >       "uid":  {
	I1124 09:19:24.577751 1701291 command_runner.go:130] >         "value":  "0"
	I1124 09:19:24.577754 1701291 command_runner.go:130] >       },
	I1124 09:19:24.577758 1701291 command_runner.go:130] >       "username":  "",
	I1124 09:19:24.577762 1701291 command_runner.go:130] >       "pinned":  false
	I1124 09:19:24.577768 1701291 command_runner.go:130] >     },
	I1124 09:19:24.577771 1701291 command_runner.go:130] >     {
	I1124 09:19:24.577779 1701291 command_runner.go:130] >       "id":  "sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd",
	I1124 09:19:24.577786 1701291 command_runner.go:130] >       "repoTags":  [
	I1124 09:19:24.577791 1701291 command_runner.go:130] >         "registry.k8s.io/pause:3.10.1"
	I1124 09:19:24.577794 1701291 command_runner.go:130] >       ],
	I1124 09:19:24.577797 1701291 command_runner.go:130] >       "repoDigests":  [],
	I1124 09:19:24.577801 1701291 command_runner.go:130] >       "size":  "265458",
	I1124 09:19:24.577805 1701291 command_runner.go:130] >       "uid":  {
	I1124 09:19:24.577809 1701291 command_runner.go:130] >         "value":  "65535"
	I1124 09:19:24.577815 1701291 command_runner.go:130] >       },
	I1124 09:19:24.577819 1701291 command_runner.go:130] >       "username":  "",
	I1124 09:19:24.577824 1701291 command_runner.go:130] >       "pinned":  true
	I1124 09:19:24.577827 1701291 command_runner.go:130] >     }
	I1124 09:19:24.577831 1701291 command_runner.go:130] >   ]
	I1124 09:19:24.577842 1701291 command_runner.go:130] > }
	I1124 09:19:24.577988 1701291 containerd.go:627] all images are preloaded for containerd runtime.
	I1124 09:19:24.578000 1701291 cache_images.go:86] Images are preloaded, skipping loading
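	The preload decision above hinges on the `sudo crictl images --output json` output containing the expected image tags. A minimal sketch of that kind of check (not minikube's actual code; the JSON here is a stubbed stand-in for the real crictl output):

```shell
# Sketch: decide "preloaded" by checking that a required image tag appears in
# crictl's JSON image list. images_json is a stubbed stand-in for the output
# of `sudo crictl images --output json`.
images_json='{"images": [{"repoTags": ["registry.k8s.io/pause:3.10.1"], "pinned": true}]}'
if printf '%s' "$images_json" | grep -q 'registry.k8s.io/pause:3.10.1'; then
  echo "pause image preloaded"
fi
```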
	I1124 09:19:24.578012 1701291 kubeadm.go:935] updating node { 192.168.49.2 8441 v1.35.0-beta.0 containerd true true} ...
	I1124 09:19:24.578111 1701291 kubeadm.go:947] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.35.0-beta.0/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=functional-291288 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.49.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-291288 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I1124 09:19:24.578176 1701291 ssh_runner.go:195] Run: sudo crictl info
	I1124 09:19:24.601872 1701291 command_runner.go:130] > {
	I1124 09:19:24.601895 1701291 command_runner.go:130] >   "cniconfig": {
	I1124 09:19:24.601901 1701291 command_runner.go:130] >     "Networks": [
	I1124 09:19:24.601905 1701291 command_runner.go:130] >       {
	I1124 09:19:24.601909 1701291 command_runner.go:130] >         "Config": {
	I1124 09:19:24.601914 1701291 command_runner.go:130] >           "CNIVersion": "0.3.1",
	I1124 09:19:24.601919 1701291 command_runner.go:130] >           "Name": "cni-loopback",
	I1124 09:19:24.601924 1701291 command_runner.go:130] >           "Plugins": [
	I1124 09:19:24.601927 1701291 command_runner.go:130] >             {
	I1124 09:19:24.601931 1701291 command_runner.go:130] >               "Network": {
	I1124 09:19:24.601935 1701291 command_runner.go:130] >                 "ipam": {},
	I1124 09:19:24.601941 1701291 command_runner.go:130] >                 "type": "loopback"
	I1124 09:19:24.601945 1701291 command_runner.go:130] >               },
	I1124 09:19:24.601958 1701291 command_runner.go:130] >               "Source": "{\"type\":\"loopback\"}"
	I1124 09:19:24.601965 1701291 command_runner.go:130] >             }
	I1124 09:19:24.601969 1701291 command_runner.go:130] >           ],
	I1124 09:19:24.601979 1701291 command_runner.go:130] >           "Source": "{\n\"cniVersion\": \"0.3.1\",\n\"name\": \"cni-loopback\",\n\"plugins\": [{\n  \"type\": \"loopback\"\n}]\n}"
	I1124 09:19:24.601983 1701291 command_runner.go:130] >         },
	I1124 09:19:24.601991 1701291 command_runner.go:130] >         "IFName": "lo"
	I1124 09:19:24.601994 1701291 command_runner.go:130] >       }
	I1124 09:19:24.601997 1701291 command_runner.go:130] >     ],
	I1124 09:19:24.602003 1701291 command_runner.go:130] >     "PluginConfDir": "/etc/cni/net.d",
	I1124 09:19:24.602007 1701291 command_runner.go:130] >     "PluginDirs": [
	I1124 09:19:24.602014 1701291 command_runner.go:130] >       "/opt/cni/bin"
	I1124 09:19:24.602018 1701291 command_runner.go:130] >     ],
	I1124 09:19:24.602026 1701291 command_runner.go:130] >     "PluginMaxConfNum": 1,
	I1124 09:19:24.602030 1701291 command_runner.go:130] >     "Prefix": "eth"
	I1124 09:19:24.602033 1701291 command_runner.go:130] >   },
	I1124 09:19:24.602037 1701291 command_runner.go:130] >   "config": {
	I1124 09:19:24.602041 1701291 command_runner.go:130] >     "cdiSpecDirs": [
	I1124 09:19:24.602048 1701291 command_runner.go:130] >       "/etc/cdi",
	I1124 09:19:24.602051 1701291 command_runner.go:130] >       "/var/run/cdi"
	I1124 09:19:24.602055 1701291 command_runner.go:130] >     ],
	I1124 09:19:24.602069 1701291 command_runner.go:130] >     "cni": {
	I1124 09:19:24.602073 1701291 command_runner.go:130] >       "binDir": "",
	I1124 09:19:24.602076 1701291 command_runner.go:130] >       "binDirs": [
	I1124 09:19:24.602080 1701291 command_runner.go:130] >         "/opt/cni/bin"
	I1124 09:19:24.602083 1701291 command_runner.go:130] >       ],
	I1124 09:19:24.602087 1701291 command_runner.go:130] >       "confDir": "/etc/cni/net.d",
	I1124 09:19:24.602092 1701291 command_runner.go:130] >       "confTemplate": "",
	I1124 09:19:24.602098 1701291 command_runner.go:130] >       "ipPref": "",
	I1124 09:19:24.602103 1701291 command_runner.go:130] >       "maxConfNum": 1,
	I1124 09:19:24.602109 1701291 command_runner.go:130] >       "setupSerially": false,
	I1124 09:19:24.602114 1701291 command_runner.go:130] >       "useInternalLoopback": false
	I1124 09:19:24.602120 1701291 command_runner.go:130] >     },
	I1124 09:19:24.602126 1701291 command_runner.go:130] >     "containerd": {
	I1124 09:19:24.602132 1701291 command_runner.go:130] >       "defaultRuntimeName": "runc",
	I1124 09:19:24.602137 1701291 command_runner.go:130] >       "ignoreBlockIONotEnabledErrors": false,
	I1124 09:19:24.602145 1701291 command_runner.go:130] >       "ignoreRdtNotEnabledErrors": false,
	I1124 09:19:24.602149 1701291 command_runner.go:130] >       "runtimes": {
	I1124 09:19:24.602152 1701291 command_runner.go:130] >         "runc": {
	I1124 09:19:24.602157 1701291 command_runner.go:130] >           "ContainerAnnotations": null,
	I1124 09:19:24.602163 1701291 command_runner.go:130] >           "PodAnnotations": null,
	I1124 09:19:24.602169 1701291 command_runner.go:130] >           "baseRuntimeSpec": "",
	I1124 09:19:24.602174 1701291 command_runner.go:130] >           "cgroupWritable": false,
	I1124 09:19:24.602179 1701291 command_runner.go:130] >           "cniConfDir": "",
	I1124 09:19:24.602185 1701291 command_runner.go:130] >           "cniMaxConfNum": 0,
	I1124 09:19:24.602190 1701291 command_runner.go:130] >           "io_type": "",
	I1124 09:19:24.602195 1701291 command_runner.go:130] >           "options": {
	I1124 09:19:24.602200 1701291 command_runner.go:130] >             "BinaryName": "",
	I1124 09:19:24.602212 1701291 command_runner.go:130] >             "CriuImagePath": "",
	I1124 09:19:24.602217 1701291 command_runner.go:130] >             "CriuWorkPath": "",
	I1124 09:19:24.602221 1701291 command_runner.go:130] >             "IoGid": 0,
	I1124 09:19:24.602226 1701291 command_runner.go:130] >             "IoUid": 0,
	I1124 09:19:24.602232 1701291 command_runner.go:130] >             "NoNewKeyring": false,
	I1124 09:19:24.602237 1701291 command_runner.go:130] >             "Root": "",
	I1124 09:19:24.602243 1701291 command_runner.go:130] >             "ShimCgroup": "",
	I1124 09:19:24.602248 1701291 command_runner.go:130] >             "SystemdCgroup": false
	I1124 09:19:24.602252 1701291 command_runner.go:130] >           },
	I1124 09:19:24.602257 1701291 command_runner.go:130] >           "privileged_without_host_devices": false,
	I1124 09:19:24.602266 1701291 command_runner.go:130] >           "privileged_without_host_devices_all_devices_allowed": false,
	I1124 09:19:24.602272 1701291 command_runner.go:130] >           "runtimePath": "",
	I1124 09:19:24.602278 1701291 command_runner.go:130] >           "runtimeType": "io.containerd.runc.v2",
	I1124 09:19:24.602285 1701291 command_runner.go:130] >           "sandboxer": "podsandbox",
	I1124 09:19:24.602290 1701291 command_runner.go:130] >           "snapshotter": ""
	I1124 09:19:24.602293 1701291 command_runner.go:130] >         }
	I1124 09:19:24.602296 1701291 command_runner.go:130] >       }
	I1124 09:19:24.602299 1701291 command_runner.go:130] >     },
	I1124 09:19:24.602309 1701291 command_runner.go:130] >     "containerdEndpoint": "/run/containerd/containerd.sock",
	I1124 09:19:24.602332 1701291 command_runner.go:130] >     "containerdRootDir": "/var/lib/containerd",
	I1124 09:19:24.602339 1701291 command_runner.go:130] >     "device_ownership_from_security_context": false,
	I1124 09:19:24.602344 1701291 command_runner.go:130] >     "disableApparmor": false,
	I1124 09:19:24.602351 1701291 command_runner.go:130] >     "disableHugetlbController": true,
	I1124 09:19:24.602355 1701291 command_runner.go:130] >     "disableProcMount": false,
	I1124 09:19:24.602362 1701291 command_runner.go:130] >     "drainExecSyncIOTimeout": "0s",
	I1124 09:19:24.602366 1701291 command_runner.go:130] >     "enableCDI": true,
	I1124 09:19:24.602378 1701291 command_runner.go:130] >     "enableSelinux": false,
	I1124 09:19:24.602382 1701291 command_runner.go:130] >     "enableUnprivilegedICMP": true,
	I1124 09:19:24.602387 1701291 command_runner.go:130] >     "enableUnprivilegedPorts": true,
	I1124 09:19:24.602392 1701291 command_runner.go:130] >     "ignoreDeprecationWarnings": null,
	I1124 09:19:24.602403 1701291 command_runner.go:130] >     "ignoreImageDefinedVolumes": false,
	I1124 09:19:24.602408 1701291 command_runner.go:130] >     "maxContainerLogLineSize": 16384,
	I1124 09:19:24.602413 1701291 command_runner.go:130] >     "netnsMountsUnderStateDir": false,
	I1124 09:19:24.602417 1701291 command_runner.go:130] >     "restrictOOMScoreAdj": false,
	I1124 09:19:24.602422 1701291 command_runner.go:130] >     "rootDir": "/var/lib/containerd/io.containerd.grpc.v1.cri",
	I1124 09:19:24.602427 1701291 command_runner.go:130] >     "selinuxCategoryRange": 1024,
	I1124 09:19:24.602432 1701291 command_runner.go:130] >     "stateDir": "/run/containerd/io.containerd.grpc.v1.cri",
	I1124 09:19:24.602438 1701291 command_runner.go:130] >     "tolerateMissingHugetlbController": true,
	I1124 09:19:24.602441 1701291 command_runner.go:130] >     "unsetSeccompProfile": ""
	I1124 09:19:24.602445 1701291 command_runner.go:130] >   },
	I1124 09:19:24.602449 1701291 command_runner.go:130] >   "features": {
	I1124 09:19:24.602492 1701291 command_runner.go:130] >     "supplemental_groups_policy": true
	I1124 09:19:24.602500 1701291 command_runner.go:130] >   },
	I1124 09:19:24.602504 1701291 command_runner.go:130] >   "golang": "go1.24.9",
	I1124 09:19:24.602513 1701291 command_runner.go:130] >   "lastCNILoadStatus": "cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config",
	I1124 09:19:24.602527 1701291 command_runner.go:130] >   "lastCNILoadStatus.default": "cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config",
	I1124 09:19:24.602532 1701291 command_runner.go:130] >   "runtimeHandlers": [
	I1124 09:19:24.602537 1701291 command_runner.go:130] >     {
	I1124 09:19:24.602541 1701291 command_runner.go:130] >       "features": {
	I1124 09:19:24.602546 1701291 command_runner.go:130] >         "recursive_read_only_mounts": true,
	I1124 09:19:24.602550 1701291 command_runner.go:130] >         "user_namespaces": true
	I1124 09:19:24.602555 1701291 command_runner.go:130] >       }
	I1124 09:19:24.602564 1701291 command_runner.go:130] >     },
	I1124 09:19:24.602570 1701291 command_runner.go:130] >     {
	I1124 09:19:24.602575 1701291 command_runner.go:130] >       "features": {
	I1124 09:19:24.602587 1701291 command_runner.go:130] >         "recursive_read_only_mounts": true,
	I1124 09:19:24.602592 1701291 command_runner.go:130] >         "user_namespaces": true
	I1124 09:19:24.602595 1701291 command_runner.go:130] >       },
	I1124 09:19:24.602598 1701291 command_runner.go:130] >       "name": "runc"
	I1124 09:19:24.602609 1701291 command_runner.go:130] >     }
	I1124 09:19:24.602612 1701291 command_runner.go:130] >   ],
	I1124 09:19:24.602615 1701291 command_runner.go:130] >   "status": {
	I1124 09:19:24.602619 1701291 command_runner.go:130] >     "conditions": [
	I1124 09:19:24.602623 1701291 command_runner.go:130] >       {
	I1124 09:19:24.602629 1701291 command_runner.go:130] >         "message": "",
	I1124 09:19:24.602633 1701291 command_runner.go:130] >         "reason": "",
	I1124 09:19:24.602637 1701291 command_runner.go:130] >         "status": true,
	I1124 09:19:24.602641 1701291 command_runner.go:130] >         "type": "RuntimeReady"
	I1124 09:19:24.602645 1701291 command_runner.go:130] >       },
	I1124 09:19:24.602648 1701291 command_runner.go:130] >       {
	I1124 09:19:24.602655 1701291 command_runner.go:130] >         "message": "Network plugin returns error: cni plugin not initialized",
	I1124 09:19:24.602662 1701291 command_runner.go:130] >         "reason": "NetworkPluginNotReady",
	I1124 09:19:24.602666 1701291 command_runner.go:130] >         "status": false,
	I1124 09:19:24.602678 1701291 command_runner.go:130] >         "type": "NetworkReady"
	I1124 09:19:24.602682 1701291 command_runner.go:130] >       },
	I1124 09:19:24.602685 1701291 command_runner.go:130] >       {
	I1124 09:19:24.602688 1701291 command_runner.go:130] >         "message": "",
	I1124 09:19:24.602692 1701291 command_runner.go:130] >         "reason": "",
	I1124 09:19:24.602703 1701291 command_runner.go:130] >         "status": true,
	I1124 09:19:24.602709 1701291 command_runner.go:130] >         "type": "ContainerdHasNoDeprecationWarnings"
	I1124 09:19:24.602712 1701291 command_runner.go:130] >       }
	I1124 09:19:24.602715 1701291 command_runner.go:130] >     ]
	I1124 09:19:24.602718 1701291 command_runner.go:130] >   }
	I1124 09:19:24.602721 1701291 command_runner.go:130] > }
	I1124 09:19:24.603033 1701291 cni.go:84] Creating CNI manager for ""
	I1124 09:19:24.603051 1701291 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
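	The kindnet recommendation follows from the `crictl info` status block above, where the `NetworkReady` condition is false with reason `NetworkPluginNotReady`. A minimal sketch of reading that condition (stubbed input, not minikube's implementation):

```shell
# Sketch: detect that the container runtime reports no initialized CNI.
# status_json stands in for the "status" fragment of `sudo crictl info`.
status_json='{"conditions": [{"type": "NetworkReady", "status": false, "reason": "NetworkPluginNotReady"}]}'
if printf '%s' "$status_json" | grep -q 'NetworkPluginNotReady'; then
  echo "CNI not yet initialized"
fi
```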
	I1124 09:19:24.603074 1701291 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1124 09:19:24.603102 1701291 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.49.2 APIServerPort:8441 KubernetesVersion:v1.35.0-beta.0 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:functional-291288 NodeName:functional-291288 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.49.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.49.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1124 09:19:24.603228 1701291 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.49.2
	  bindPort: 8441
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "functional-291288"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.49.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8441
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.35.0-beta.0
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I1124 09:19:24.603309 1701291 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.35.0-beta.0
	I1124 09:19:24.611119 1701291 command_runner.go:130] > kubeadm
	I1124 09:19:24.611140 1701291 command_runner.go:130] > kubectl
	I1124 09:19:24.611146 1701291 command_runner.go:130] > kubelet
	I1124 09:19:24.611161 1701291 binaries.go:51] Found k8s binaries, skipping transfer
	I1124 09:19:24.611223 1701291 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1124 09:19:24.618883 1701291 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (328 bytes)
	I1124 09:19:24.633448 1701291 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (359 bytes)
	I1124 09:19:24.650072 1701291 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2237 bytes)
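	The generated kubeadm config above is staged on the node as `/var/tmp/minikube/kubeadm.yaml.new` before use. A minimal sketch of staging such a config and sanity-checking key fields first (hypothetical workflow, using only a fragment of the real config):

```shell
# Sketch: write a generated kubeadm config fragment to a temp file and verify
# the control-plane endpoint and pod CIDR before shipping it to the node.
cfg=$(mktemp)
cat > "$cfg" <<'EOF'
apiVersion: kubeadm.k8s.io/v1beta4
kind: ClusterConfiguration
controlPlaneEndpoint: control-plane.minikube.internal:8441
networking:
  podSubnet: "10.244.0.0/16"
EOF
grep -q 'controlPlaneEndpoint: control-plane.minikube.internal:8441' "$cfg" \
  && grep -q 'podSubnet: "10.244.0.0/16"' "$cfg" \
  && echo "config fields present"
```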
	I1124 09:19:24.664688 1701291 ssh_runner.go:195] Run: grep 192.168.49.2	control-plane.minikube.internal$ /etc/hosts
	I1124 09:19:24.668362 1701291 command_runner.go:130] > 192.168.49.2	control-plane.minikube.internal
	I1124 09:19:24.668996 1701291 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1124 09:19:24.787731 1701291 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1124 09:19:25.630718 1701291 certs.go:69] Setting up /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288 for IP: 192.168.49.2
	I1124 09:19:25.630736 1701291 certs.go:195] generating shared ca certs ...
	I1124 09:19:25.630751 1701291 certs.go:227] acquiring lock for ca certs: {Name:mkbe540a30c4376a351176f7fe6fec044d058b09 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1124 09:19:25.630878 1701291 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.key
	I1124 09:19:25.630932 1701291 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/proxy-client-ca.key
	I1124 09:19:25.630939 1701291 certs.go:257] generating profile certs ...
	I1124 09:19:25.631060 1701291 certs.go:360] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/client.key
	I1124 09:19:25.631119 1701291 certs.go:360] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/apiserver.key.5acb2515
	I1124 09:19:25.631156 1701291 certs.go:360] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/proxy-client.key
	I1124 09:19:25.631166 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I1124 09:19:25.631180 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I1124 09:19:25.631190 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I1124 09:19:25.631200 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I1124 09:19:25.631210 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I1124 09:19:25.631221 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I1124 09:19:25.631231 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I1124 09:19:25.631241 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I1124 09:19:25.631304 1701291 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/1654467.pem (1338 bytes)
	W1124 09:19:25.631338 1701291 certs.go:480] ignoring /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/1654467_empty.pem, impossibly tiny 0 bytes
	I1124 09:19:25.631352 1701291 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca-key.pem (1671 bytes)
	I1124 09:19:25.631382 1701291 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem (1078 bytes)
	I1124 09:19:25.631410 1701291 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/cert.pem (1123 bytes)
	I1124 09:19:25.631434 1701291 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/key.pem (1679 bytes)
	I1124 09:19:25.631484 1701291 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/ssl/certs/16544672.pem (1708 bytes)
	I1124 09:19:25.631512 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I1124 09:19:25.631529 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/1654467.pem -> /usr/share/ca-certificates/1654467.pem
	I1124 09:19:25.631542 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/ssl/certs/16544672.pem -> /usr/share/ca-certificates/16544672.pem
	I1124 09:19:25.632117 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1124 09:19:25.653566 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I1124 09:19:25.672677 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1124 09:19:25.692448 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I1124 09:19:25.712758 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I1124 09:19:25.730246 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I1124 09:19:25.748136 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1124 09:19:25.765102 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I1124 09:19:25.782676 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1124 09:19:25.800418 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/1654467.pem --> /usr/share/ca-certificates/1654467.pem (1338 bytes)
	I1124 09:19:25.818179 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/ssl/certs/16544672.pem --> /usr/share/ca-certificates/16544672.pem (1708 bytes)
	I1124 09:19:25.836420 1701291 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I1124 09:19:25.849273 1701291 ssh_runner.go:195] Run: openssl version
	I1124 09:19:25.855675 1701291 command_runner.go:130] > OpenSSL 3.0.17 1 Jul 2025 (Library: OpenSSL 3.0.17 1 Jul 2025)
	I1124 09:19:25.855803 1701291 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I1124 09:19:25.864243 1701291 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1124 09:19:25.867919 1701291 command_runner.go:130] > -rw-r--r-- 1 root root 1111 Nov 24 08:44 /usr/share/ca-certificates/minikubeCA.pem
	I1124 09:19:25.867982 1701291 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Nov 24 08:44 /usr/share/ca-certificates/minikubeCA.pem
	I1124 09:19:25.868042 1701291 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1124 09:19:25.908611 1701291 command_runner.go:130] > b5213941
	I1124 09:19:25.909123 1701291 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I1124 09:19:25.916880 1701291 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1654467.pem && ln -fs /usr/share/ca-certificates/1654467.pem /etc/ssl/certs/1654467.pem"
	I1124 09:19:25.925097 1701291 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1654467.pem
	I1124 09:19:25.928711 1701291 command_runner.go:130] > -rw-r--r-- 1 root root 1338 Nov 24 09:10 /usr/share/ca-certificates/1654467.pem
	I1124 09:19:25.928823 1701291 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Nov 24 09:10 /usr/share/ca-certificates/1654467.pem
	I1124 09:19:25.928900 1701291 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1654467.pem
	I1124 09:19:25.969833 1701291 command_runner.go:130] > 51391683
	I1124 09:19:25.970298 1701291 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1654467.pem /etc/ssl/certs/51391683.0"
	I1124 09:19:25.978202 1701291 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/16544672.pem && ln -fs /usr/share/ca-certificates/16544672.pem /etc/ssl/certs/16544672.pem"
	I1124 09:19:25.986297 1701291 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/16544672.pem
	I1124 09:19:25.989958 1701291 command_runner.go:130] > -rw-r--r-- 1 root root 1708 Nov 24 09:10 /usr/share/ca-certificates/16544672.pem
	I1124 09:19:25.990028 1701291 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Nov 24 09:10 /usr/share/ca-certificates/16544672.pem
	I1124 09:19:25.990094 1701291 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/16544672.pem
	I1124 09:19:26.030947 1701291 command_runner.go:130] > 3ec20f2e
	I1124 09:19:26.031428 1701291 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/16544672.pem /etc/ssl/certs/3ec20f2e.0"
	I1124 09:19:26.039972 1701291 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1124 09:19:26.043966 1701291 command_runner.go:130] >   File: /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1124 09:19:26.043995 1701291 command_runner.go:130] >   Size: 1176      	Blocks: 8          IO Block: 4096   regular file
	I1124 09:19:26.044001 1701291 command_runner.go:130] > Device: 259,1	Inode: 1320367     Links: 1
	I1124 09:19:26.044008 1701291 command_runner.go:130] > Access: (0644/-rw-r--r--)  Uid: (    0/    root)   Gid: (    0/    root)
	I1124 09:19:26.044023 1701291 command_runner.go:130] > Access: 2025-11-24 09:15:17.409446871 +0000
	I1124 09:19:26.044028 1701291 command_runner.go:130] > Modify: 2025-11-24 09:11:12.722825550 +0000
	I1124 09:19:26.044034 1701291 command_runner.go:130] > Change: 2025-11-24 09:11:12.722825550 +0000
	I1124 09:19:26.044039 1701291 command_runner.go:130] >  Birth: 2025-11-24 09:11:12.722825550 +0000
	I1124 09:19:26.044132 1701291 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I1124 09:19:26.086676 1701291 command_runner.go:130] > Certificate will not expire
	I1124 09:19:26.086876 1701291 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I1124 09:19:26.129915 1701291 command_runner.go:130] > Certificate will not expire
	I1124 09:19:26.130020 1701291 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I1124 09:19:26.173544 1701291 command_runner.go:130] > Certificate will not expire
	I1124 09:19:26.174084 1701291 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I1124 09:19:26.214370 1701291 command_runner.go:130] > Certificate will not expire
	I1124 09:19:26.214874 1701291 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I1124 09:19:26.257535 1701291 command_runner.go:130] > Certificate will not expire
	I1124 09:19:26.257999 1701291 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I1124 09:19:26.298467 1701291 command_runner.go:130] > Certificate will not expire
	I1124 09:19:26.298937 1701291 kubeadm.go:401] StartCluster: {Name:functional-291288 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-291288 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1124 09:19:26.299045 1701291 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I1124 09:19:26.299146 1701291 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1124 09:19:26.324900 1701291 cri.go:89] found id: ""
	I1124 09:19:26.325047 1701291 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1124 09:19:26.331898 1701291 command_runner.go:130] > /var/lib/kubelet/config.yaml
	I1124 09:19:26.331976 1701291 command_runner.go:130] > /var/lib/kubelet/kubeadm-flags.env
	I1124 09:19:26.331999 1701291 command_runner.go:130] > /var/lib/minikube/etcd:
	I1124 09:19:26.332730 1701291 kubeadm.go:417] found existing configuration files, will attempt cluster restart
	I1124 09:19:26.332771 1701291 kubeadm.go:598] restartPrimaryControlPlane start ...
	I1124 09:19:26.332851 1701291 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I1124 09:19:26.340023 1701291 kubeadm.go:131] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I1124 09:19:26.340455 1701291 kubeconfig.go:47] verify endpoint returned: get endpoint: "functional-291288" does not appear in /home/jenkins/minikube-integration/21978-1652607/kubeconfig
	I1124 09:19:26.340556 1701291 kubeconfig.go:62] /home/jenkins/minikube-integration/21978-1652607/kubeconfig needs updating (will repair): [kubeconfig missing "functional-291288" cluster setting kubeconfig missing "functional-291288" context setting]
	I1124 09:19:26.340827 1701291 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21978-1652607/kubeconfig: {Name:mk02121ae6148bede61eabf0ed4e1826024715f4 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1124 09:19:26.341245 1701291 loader.go:402] Config loaded from file:  /home/jenkins/minikube-integration/21978-1652607/kubeconfig
	I1124 09:19:26.341410 1701291 kapi.go:59] client config for functional-291288: &rest.Config{Host:"https://192.168.49.2:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/client.crt", KeyFile:"/home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/client.key", CAFile:"/home/jenkins/minikube-integration/21978-1652607/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1fb2df0), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1124 09:19:26.341966 1701291 envvar.go:172] "Feature gate default state" feature="ClientsAllowCBOR" enabled=false
	I1124 09:19:26.341987 1701291 envvar.go:172] "Feature gate default state" feature="ClientsPreferCBOR" enabled=false
	I1124 09:19:26.341993 1701291 envvar.go:172] "Feature gate default state" feature="InformerResourceVersion" enabled=false
	I1124 09:19:26.341999 1701291 envvar.go:172] "Feature gate default state" feature="InOrderInformers" enabled=true
	I1124 09:19:26.342005 1701291 envvar.go:172] "Feature gate default state" feature="WatchListClient" enabled=false
	I1124 09:19:26.342302 1701291 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I1124 09:19:26.342404 1701291 cert_rotation.go:141] "Starting client certificate rotation controller" logger="tls-transport-cache"
	I1124 09:19:26.349720 1701291 kubeadm.go:635] The running cluster does not require reconfiguration: 192.168.49.2
	I1124 09:19:26.349757 1701291 kubeadm.go:602] duration metric: took 16.96677ms to restartPrimaryControlPlane
	I1124 09:19:26.349768 1701291 kubeadm.go:403] duration metric: took 50.840633ms to StartCluster
	I1124 09:19:26.349802 1701291 settings.go:142] acquiring lock: {Name:mk6c04793f5fd4f38f92abf4357247f2ccd7fc4e Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1124 09:19:26.349888 1701291 settings.go:150] Updating kubeconfig:  /home/jenkins/minikube-integration/21978-1652607/kubeconfig
	I1124 09:19:26.350548 1701291 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21978-1652607/kubeconfig: {Name:mk02121ae6148bede61eabf0ed4e1826024715f4 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1124 09:19:26.350757 1701291 start.go:236] Will wait 6m0s for node &{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I1124 09:19:26.351051 1701291 config.go:182] Loaded profile config "functional-291288": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1124 09:19:26.351103 1701291 addons.go:527] enable addons start: toEnable=map[ambassador:false amd-gpu-device-plugin:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:true efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubetail:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:true storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I1124 09:19:26.351171 1701291 addons.go:70] Setting storage-provisioner=true in profile "functional-291288"
	I1124 09:19:26.351184 1701291 addons.go:239] Setting addon storage-provisioner=true in "functional-291288"
	I1124 09:19:26.351210 1701291 host.go:66] Checking if "functional-291288" exists ...
	I1124 09:19:26.351260 1701291 addons.go:70] Setting default-storageclass=true in profile "functional-291288"
	I1124 09:19:26.351281 1701291 addons_storage_classes.go:34] enableOrDisableStorageClasses default-storageclass=true on "functional-291288"
	I1124 09:19:26.351591 1701291 cli_runner.go:164] Run: docker container inspect functional-291288 --format={{.State.Status}}
	I1124 09:19:26.351665 1701291 cli_runner.go:164] Run: docker container inspect functional-291288 --format={{.State.Status}}
	I1124 09:19:26.356026 1701291 out.go:179] * Verifying Kubernetes components...
	I1124 09:19:26.358753 1701291 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1124 09:19:26.386934 1701291 loader.go:402] Config loaded from file:  /home/jenkins/minikube-integration/21978-1652607/kubeconfig
	I1124 09:19:26.387124 1701291 kapi.go:59] client config for functional-291288: &rest.Config{Host:"https://192.168.49.2:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/client.crt", KeyFile:"/home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/client.key", CAFile:"/home/jenkins/minikube-integration/21978-1652607/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1fb2df0), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1124 09:19:26.387397 1701291 addons.go:239] Setting addon default-storageclass=true in "functional-291288"
	I1124 09:19:26.387423 1701291 host.go:66] Checking if "functional-291288" exists ...
	I1124 09:19:26.387832 1701291 cli_runner.go:164] Run: docker container inspect functional-291288 --format={{.State.Status}}
	I1124 09:19:26.389901 1701291 out.go:179]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	I1124 09:19:26.395008 1701291 addons.go:436] installing /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 09:19:26.395037 1701291 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
	I1124 09:19:26.395101 1701291 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:19:26.420232 1701291 addons.go:436] installing /etc/kubernetes/addons/storageclass.yaml
	I1124 09:19:26.420253 1701291 ssh_runner.go:362] scp storageclass/storageclass.yaml --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
	I1124 09:19:26.420313 1701291 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:19:26.425570 1701291 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34684 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-291288/id_rsa Username:docker}
	I1124 09:19:26.456516 1701291 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34684 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-291288/id_rsa Username:docker}
	I1124 09:19:26.560922 1701291 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1124 09:19:26.576856 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 09:19:26.613035 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
	I1124 09:19:27.382844 1701291 node_ready.go:35] waiting up to 6m0s for node "functional-291288" to be "Ready" ...
	I1124 09:19:27.383045 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:27.383222 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:27.383136 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:27.383333 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:27.383470 1701291 retry.go:31] will retry after 330.402351ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:27.383574 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:27.383622 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:27.383641 1701291 retry.go:31] will retry after 362.15201ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:27.383749 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:27.714181 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 09:19:27.746972 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1124 09:19:27.795758 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:27.795808 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:27.795853 1701291 retry.go:31] will retry after 486.739155ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:27.825835 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:27.825930 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:27.825968 1701291 retry.go:31] will retry after 300.110995ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:27.884058 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:27.884183 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:27.884499 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:28.126983 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1124 09:19:28.217006 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:28.217052 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:28.217072 1701291 retry.go:31] will retry after 300.765079ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:28.283248 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 09:19:28.347318 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:28.347417 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:28.347441 1701291 retry.go:31] will retry after 303.335388ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:28.383528 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:28.383642 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:28.383982 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:28.518292 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1124 09:19:28.580592 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:28.580640 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:28.580660 1701291 retry.go:31] will retry after 1.066338993s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:28.651903 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 09:19:28.713844 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:28.713897 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:28.713918 1701291 retry.go:31] will retry after 1.056665241s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:28.884118 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:28.884220 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:28.884569 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:29.383298 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:29.383424 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:29.383770 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:19:29.383848 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:19:29.647985 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1124 09:19:29.716805 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:29.720169 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:29.720200 1701291 retry.go:31] will retry after 944.131514ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:29.771443 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 09:19:29.838798 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:29.842880 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:29.842911 1701291 retry.go:31] will retry after 1.275018698s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:29.883132 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:29.883209 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:29.883509 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:30.383649 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:30.383776 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:30.384127 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:30.664505 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1124 09:19:30.720036 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:30.723467 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:30.723535 1701291 retry.go:31] will retry after 2.138623105s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:30.883817 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:30.883887 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:30.884224 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:31.118957 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 09:19:31.199799 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:31.199840 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:31.199882 1701291 retry.go:31] will retry after 2.182241097s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:31.383252 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:31.383376 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:31.383741 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:31.883141 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:31.883218 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:31.883484 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:19:31.883535 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:19:32.383203 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:32.383282 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:32.383615 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:32.863283 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1124 09:19:32.883678 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:32.883784 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:32.884128 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:32.923038 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:32.923079 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:32.923098 1701291 retry.go:31] will retry after 3.572603171s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:33.382308 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 09:19:33.383761 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:33.383826 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:33.384119 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:33.453074 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:33.453119 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:33.453141 1701291 retry.go:31] will retry after 3.109489242s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:33.883699 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:33.883773 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:33.884102 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:19:33.884157 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:19:34.383924 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:34.383999 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:34.384345 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:34.883591 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:34.883679 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:34.883980 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:35.383814 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:35.383894 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:35.384241 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:35.884036 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:35.884171 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:35.884537 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:19:35.884594 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:19:36.383696 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:36.383766 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:36.384025 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:36.496437 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1124 09:19:36.551663 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:36.555562 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:36.555638 1701291 retry.go:31] will retry after 5.073494199s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:36.562783 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 09:19:36.628271 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:36.628317 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:36.628342 1701291 retry.go:31] will retry after 5.770336946s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:36.883845 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:36.883918 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:36.884243 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:37.384077 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:37.384153 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:37.384472 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:37.883154 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:37.883226 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:37.883536 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:38.383157 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:38.383232 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:38.383563 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:19:38.383620 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:19:38.883187 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:38.883316 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:38.883616 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:39.383889 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:39.383969 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:39.384246 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:39.884093 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:39.884182 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:39.884521 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:40.383195 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:40.383272 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:40.383608 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:19:40.383663 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:19:40.883068 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:40.883144 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:40.883421 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:41.383174 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:41.383248 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:41.383670 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:41.630088 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1124 09:19:41.704671 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:41.704728 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:41.704747 1701291 retry.go:31] will retry after 8.448093656s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:41.884076 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:41.884161 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:41.884479 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:42.383803 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:42.383879 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:42.384141 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:19:42.384184 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:19:42.399541 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 09:19:42.476011 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:42.476071 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:42.476093 1701291 retry.go:31] will retry after 9.502945959s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:42.883588 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:42.883671 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:42.884026 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:43.383828 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:43.383907 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:43.384181 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:43.883670 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:43.883743 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:43.884060 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:44.383696 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:44.383771 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:44.384127 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:19:44.384222 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:19:44.883976 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:44.884053 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:44.884413 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:45.383648 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:45.383811 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:45.384197 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:45.883998 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:45.884089 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:45.884467 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:46.383603 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:46.383678 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:46.384022 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:46.883619 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:46.883699 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:46.883981 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:19:46.884038 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:19:47.383777 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:47.383855 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:47.384200 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:47.883911 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:47.884016 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:47.884384 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:48.383668 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:48.383739 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:48.384087 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:48.883874 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:48.883952 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:48.884283 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:19:48.884343 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:19:49.383082 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:49.383173 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:49.383540 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:49.883082 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:49.883151 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:49.883411 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:50.153986 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1124 09:19:50.216789 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:50.216837 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:50.216857 1701291 retry.go:31] will retry after 12.027560843s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:50.383117 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:50.383226 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:50.383583 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:50.883287 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:50.883368 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:50.883726 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:51.383622 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:51.383710 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:51.384038 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:19:51.384100 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:19:51.883690 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:51.883770 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:51.884105 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:51.979351 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 09:19:52.048232 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:52.048287 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:52.048307 1701291 retry.go:31] will retry after 5.922680138s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:52.383846 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:52.383926 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:52.384262 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:52.883642 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:52.883714 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:52.884029 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:53.383844 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:53.383917 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:53.384249 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:19:53.384309 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:19:53.884029 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:53.884108 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:53.884493 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:54.383680 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:54.383755 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:54.384008 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:54.883852 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:54.883926 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:54.884262 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:55.384060 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:55.384132 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:55.384467 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:19:55.384528 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:19:55.883800 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:55.883874 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:55.884153 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:56.383266 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:56.383344 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:56.383682 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:56.883176 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:56.883258 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:56.883607 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:57.383853 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:57.383936 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:57.384284 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:57.884078 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:57.884157 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:57.884542 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:19:57.884608 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:19:57.972042 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 09:19:58.032393 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:58.036131 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:58.036169 1701291 retry.go:31] will retry after 15.323516146s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:58.383700 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:58.383776 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:58.384074 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:58.883637 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:58.883711 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:58.883992 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:59.383767 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:59.383847 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:59.384170 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:59.883954 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:59.884029 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:59.884364 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:00.386702 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:00.386929 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:00.387350 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:00.387652 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:00.883998 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:00.884089 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:00.884461 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:01.383250 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:01.383328 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:01.383704 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:01.883996 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:01.884068 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:01.884357 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:02.244687 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1124 09:20:02.303604 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:20:02.306952 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:20:02.306992 1701291 retry.go:31] will retry after 20.630907774s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:20:02.383202 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:02.383281 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:02.383599 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:02.883330 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:02.883410 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:02.883745 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:02.883800 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:03.383311 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:03.383386 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:03.383651 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:03.883196 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:03.883275 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:03.883594 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:04.383175 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:04.383259 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:04.383568 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:04.883120 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:04.883192 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:04.883478 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:05.383217 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:05.383295 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:05.383624 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:05.383680 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:05.883368 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:05.883446 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:05.883773 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:06.383723 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:06.383806 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:06.384068 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:06.883869 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:06.883945 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:06.884264 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:07.384063 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:07.384138 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:07.384462 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:07.384526 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:07.883109 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:07.883188 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:07.883446 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:08.383152 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:08.383224 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:08.383566 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:08.883199 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:08.883279 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:08.883603 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:09.383126 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:09.383212 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:09.383470 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:09.883162 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:09.883254 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:09.883549 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:09.883599 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:10.383190 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:10.383264 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:10.383586 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:10.883817 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:10.883892 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:10.884145 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:11.383273 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:11.383344 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:11.383622 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:11.883313 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:11.883389 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:11.883749 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:11.883806 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:12.383448 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:12.383520 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:12.383791 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:12.883177 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:12.883257 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:12.883572 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:13.360275 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 09:20:13.383805 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:13.383886 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:13.384154 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:13.423794 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:20:13.423847 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:20:13.423866 1701291 retry.go:31] will retry after 19.725114159s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:20:13.884034 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:13.884124 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:13.884430 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:13.884481 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:14.383158 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:14.383258 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:14.383624 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:14.883202 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:14.883284 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:14.883644 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:15.383356 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:15.383435 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:15.383734 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:15.883472 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:15.883549 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:15.883909 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:16.384044 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:16.384118 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:16.384464 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:16.384554 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:16.883205 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:16.883292 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:16.883609 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:17.383212 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:17.383289 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:17.383587 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:17.883207 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:17.883289 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:17.883594 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:18.383758 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:18.383840 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:18.384110 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:18.883984 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:18.884085 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:18.884539 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:18.884620 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:19.383195 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:19.383308 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:19.383679 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:19.883124 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:19.883204 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:19.883474 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:20.383183 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:20.383264 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:20.383612 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:20.883327 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:20.883410 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:20.883750 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:21.383113 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:21.383189 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:21.383447 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:21.383491 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:21.883186 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:21.883258 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:21.883619 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:22.383192 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:22.383277 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:22.383650 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:22.883350 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:22.883422 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:22.883692 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:22.939045 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1124 09:20:23.002892 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:20:23.002941 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:20:23.002963 1701291 retry.go:31] will retry after 24.365576381s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:20:23.384046 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:23.384125 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:23.384460 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:23.384522 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:23.883216 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:23.883293 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:23.883634 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:24.383833 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:24.383929 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:24.384212 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:24.884088 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:24.884168 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:24.884519 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:25.383227 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:25.383307 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:25.383654 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:25.883912 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:25.883982 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:25.884337 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:25.884396 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:26.383528 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:26.383619 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:26.383952 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:26.883735 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:26.883810 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:26.884149 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:27.383645 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:27.383725 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:27.384079 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:27.883693 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:27.883792 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:27.884080 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:28.383869 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:28.383941 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:28.384276 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:28.384333 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:28.883621 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:28.883696 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:28.884021 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:29.383693 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:29.383768 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:29.384125 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:29.883838 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:29.883920 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:29.884279 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:30.383628 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:30.383705 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:30.383961 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:30.883414 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:30.883492 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:30.883837 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:30.883893 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:31.383689 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:31.383767 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:31.384087 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:31.883629 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:31.883699 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:31.883964 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:32.383819 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:32.383897 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:32.384254 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:32.884067 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:32.884145 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:32.884453 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:32.884504 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:33.149949 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 09:20:33.204697 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:20:33.208037 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:20:33.208070 1701291 retry.go:31] will retry after 22.392696015s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:20:33.383469 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:33.383538 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:33.383796 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:33.883550 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:33.883634 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:33.883947 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:34.383737 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:34.383811 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:34.384171 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:34.883654 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:34.883734 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:34.884066 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:35.383856 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:35.383928 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:35.384271 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:35.384326 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:35.883926 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:35.884005 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:35.884370 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:36.383314 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:36.383384 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:36.383644 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:36.883149 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:36.883225 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:36.883565 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:37.383275 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:37.383359 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:37.383702 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:37.883387 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:37.883466 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:37.883722 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:37.883762 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:38.383178 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:38.383252 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:38.383603 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:38.883170 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:38.883244 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:38.883650 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:39.383205 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:39.383274 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:39.383534 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:39.883192 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:39.883267 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:39.883632 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:40.383376 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:40.383463 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:40.383839 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:40.383896 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:40.883093 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:40.883170 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:40.883479 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:41.383194 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:41.383276 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:41.383635 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:41.883337 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:41.883422 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:41.883716 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:42.383386 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:42.383461 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:42.383814 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:42.883190 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:42.883266 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:42.883601 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:42.883670 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:43.383217 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:43.383293 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:43.383671 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:43.883117 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:43.883198 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:43.883473 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:44.383174 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:44.383255 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:44.383558 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:44.883289 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:44.883363 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:44.883641 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:45.383065 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:45.383134 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:45.383415 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:45.383456 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:45.883191 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:45.883274 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:45.883563 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:46.383411 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:46.383487 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:46.383849 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:46.883402 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:46.883490 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:46.883752 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:47.369539 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1124 09:20:47.383080 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:47.383149 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:47.383440 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:47.383498 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:47.426348 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:20:47.429646 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:20:47.429686 1701291 retry.go:31] will retry after 22.399494886s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:20:47.883262 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:47.883365 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:47.883699 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:48.383121 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:48.383192 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:48.383450 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:48.883172 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:48.883246 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:48.883565 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:49.383175 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:49.383254 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:49.383619 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:49.383673 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:49.883307 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:49.883381 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:49.883648 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:50.383167 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:50.383242 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:50.383602 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:50.883305 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:50.883403 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:50.883701 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:51.383597 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:51.383671 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:51.383949 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:51.383999 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:51.883799 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:51.883891 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:51.884215 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:52.383953 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:52.384046 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:52.384337 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:52.883622 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:52.883695 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:52.883974 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:53.383750 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:53.383840 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:53.384189 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:53.384246 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:53.883872 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:53.883946 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:53.884278 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:54.383691 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:54.383768 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:54.384062 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:54.883870 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:54.883951 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:54.884279 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:55.384078 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:55.384159 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:55.384531 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:55.384594 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:55.601942 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 09:20:55.661064 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:20:55.665031 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:20:55.665156 1701291 out.go:285] ! Enabling 'storage-provisioner' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	I1124 09:20:55.883471 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:55.883549 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:55.883839 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:56.384006 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:56.384085 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:56.384438 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:56.883159 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:56.883256 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:56.883546 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:57.383058 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:57.383127 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:57.383401 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:57.883132 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:57.883210 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:57.883522 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:57.883572 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:58.383147 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:58.383243 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:58.383538 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:58.883654 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:58.883729 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:58.883987 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:59.383805 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:59.383888 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:59.384179 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:59.883985 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:59.884058 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:59.884355 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:59.884403 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:00.383754 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:00.383833 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:00.384151 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:00.883935 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:00.884016 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:00.884352 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:01.383267 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:01.383344 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:01.383652 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:01.883147 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:01.883222 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:01.883556 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:02.383255 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:02.383332 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:02.383663 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:02.383721 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:02.883448 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:02.883530 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:02.883895 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:03.383623 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:03.383692 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:03.383959 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:03.883727 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:03.883833 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:03.884183 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:04.383989 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:04.384068 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:04.384431 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:04.384491 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:04.883658 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:04.883737 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:04.884051 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:05.383792 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:05.383863 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:05.384221 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:05.883873 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:05.883951 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:05.884288 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:06.383270 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:06.383343 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:06.383618 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:06.883169 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:06.883268 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:06.883573 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:06.883620 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:07.383342 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:07.383427 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:07.383765 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:07.884027 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:07.884094 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:07.884425 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:08.383164 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:08.383239 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:08.383598 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:08.883308 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:08.883398 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:08.883741 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:08.883802 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:09.383098 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:09.383166 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:09.383423 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:09.830147 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1124 09:21:09.883707 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:09.883815 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:09.884234 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:09.887265 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:21:09.890761 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:21:09.890861 1701291 out.go:285] ! Enabling 'default-storageclass' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	I1124 09:21:09.895836 1701291 out.go:179] * Enabled addons: 
	I1124 09:21:09.899594 1701291 addons.go:530] duration metric: took 1m43.548488453s for enable addons: enabled=[]
	I1124 09:21:10.383381 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:10.383468 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:10.383851 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:10.883541 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:10.883612 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:10.883871 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:10.883921 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:11.383721 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:11.383804 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:11.384146 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:11.883758 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:11.883832 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:11.884153 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:12.383650 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:12.383725 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:12.383994 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:12.883791 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:12.883869 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:12.884200 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:12.884259 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:13.384051 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:13.384130 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:13.384481 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:13.883069 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:13.883147 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:13.883443 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:14.383158 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:14.383256 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:14.383600 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:14.883308 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:14.883386 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:14.883743 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:15.383457 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:15.383524 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:15.383790 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:15.383833 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:15.883160 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:15.883235 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:15.883570 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:16.383347 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:16.383428 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:16.383759 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:16.883325 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:16.883399 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:16.883664 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:17.383191 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:17.383290 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:17.383661 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:17.883228 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:17.883306 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:17.883672 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:17.883730 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:18.383978 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:18.384061 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:18.384373 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:18.883112 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:18.883209 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:18.883544 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:19.383112 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:19.383198 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:19.383544 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:19.883660 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:19.883735 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:19.883994 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:19.884034 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:20.383840 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:20.383939 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:20.384276 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:20.884063 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:20.884139 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:20.884609 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:21.383122 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:21.383191 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:21.383474 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:21.883229 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:21.883311 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:21.883643 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:22.383182 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:22.383259 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:22.383610 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:22.383663 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:22.884007 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:22.884077 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:22.884343 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:23.384148 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:23.384238 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:23.384581 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:23.883132 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:23.883207 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:23.883554 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:24.383088 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:24.383159 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:24.383481 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:24.883179 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:24.883275 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:24.883610 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:24.883675 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:25.383186 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:25.383268 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:25.383608 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:25.883472 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:25.883586 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:25.884129 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:26.383146 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:26.383230 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:26.383577 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:26.883188 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:26.883299 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:26.883678 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:26.883758 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:27.384111 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:27.384181 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:27.384491 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:27.883092 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:27.883171 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:27.883515 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:28.383158 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:28.383240 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:28.383623 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:28.883309 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:28.883385 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:28.883717 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:29.383191 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:29.383266 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:29.383650 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:29.383702 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:29.883188 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:29.883275 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:29.883613 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:30.383126 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:30.383193 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:30.383490 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:30.883171 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:30.883254 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:30.883605 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:31.383171 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:31.383250 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:31.383601 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:31.883122 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:31.883191 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:31.883449 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:31.883489 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:32.383217 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:32.383291 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:32.383620 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:32.883183 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:32.883260 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:32.883629 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:33.383180 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:33.383262 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:33.383549 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:33.883190 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:33.883271 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:33.883638 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:33.883694 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:34.383256 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:34.383337 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:34.383680 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:34.883132 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:34.883214 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:34.883526 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:35.383172 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:35.383252 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:35.383615 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:35.883199 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:35.883282 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:35.883582 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:36.383281 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:36.383351 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:36.383609 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:36.383650 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:36.883304 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:36.883386 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:36.883706 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:37.383434 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:37.383512 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:37.383858 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:37.883555 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:37.883635 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:37.883920 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:38.383713 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:38.383800 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:38.384150 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:38.384211 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:38.884005 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:38.884085 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:38.884432 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:39.383117 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:39.383189 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:39.383470 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:39.883210 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:39.883320 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:39.883681 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:40.383192 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:40.383273 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:40.383648 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:40.883329 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:40.883413 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:40.883677 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:40.883719 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:41.383810 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:41.383891 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:41.384260 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:41.884110 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:41.884211 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:41.884610 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:42.383111 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:42.383184 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:42.383469 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:42.883146 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:42.883219 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:42.883556 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:43.383303 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:43.383390 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:43.383815 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:43.383880 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:43.884150 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:43.884225 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:43.884489 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:44.383187 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:44.383285 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:44.383631 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:44.883347 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:44.883424 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:44.883787 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:45.383143 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:45.383221 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:45.383485 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:45.883220 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:45.883291 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:45.883631 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:45.883683 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:46.383565 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:46.383643 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:46.384005 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:46.883681 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:46.883753 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:46.884095 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:47.383931 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:47.384032 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:47.384438 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:47.884098 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:47.884173 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:47.884475 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:47.884521 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:48.383141 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:48.383214 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:48.383504 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:48.883189 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:48.883295 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:48.883641 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:49.383237 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:49.383316 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:49.383651 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:49.883138 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:49.883214 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:49.883514 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:50.383163 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:50.383242 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:50.383592 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:50.383651 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:50.883194 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:50.883284 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:50.883599 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:51.383074 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:51.383155 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:51.383436 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:51.883141 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:51.883231 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:51.883582 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:52.383155 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:52.383242 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:52.383575 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:52.883252 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:52.883327 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:52.883595 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:52.883642 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:53.383321 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:53.383392 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:53.383737 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:53.883216 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:53.883293 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:53.883646 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:54.383339 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:54.383413 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:54.383688 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:54.883201 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:54.883274 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:54.883590 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:55.383288 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:55.383366 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:55.383704 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:55.383769 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:55.883435 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:55.883505 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:55.883816 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:56.384009 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:56.384088 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:56.384422 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:56.883137 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:56.883222 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:56.883558 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:57.383818 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:57.383897 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:57.384172 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:57.384212 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:57.883977 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:57.884053 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:57.884399 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:58.383153 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:58.383233 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:58.383556 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:58.883101 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:58.883177 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:58.883433 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:59.383157 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:59.383236 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:59.383565 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:59.883168 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:59.883258 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:59.883650 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:59.883705 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:00.383305 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:00.383386 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:00.383837 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:00.883166 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:00.883245 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:00.883577 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:01.383186 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:01.383267 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:01.383606 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:01.883853 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:01.883923 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:01.884206 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:01.884257 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:02.384016 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:02.384095 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:02.384455 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:02.884100 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:02.884181 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:02.884522 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:03.383131 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:03.383207 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:03.383521 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:03.883200 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:03.883287 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:03.883604 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:04.383190 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:04.383274 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:04.383643 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:04.383702 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:04.883207 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:04.883279 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:04.883551 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:05.383188 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:05.383272 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:05.383607 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:05.883177 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:05.883257 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:05.883627 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:06.383792 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:06.383879 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:06.384240 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:06.384291 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:06.884040 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:06.884120 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:06.884445 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:07.383154 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:07.383230 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:07.383566 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:07.883872 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:07.883944 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:07.884212 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:08.383967 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:08.384042 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:08.384363 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:08.384428 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:08.883105 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:08.883184 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:08.883520 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:09.383654 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:09.383727 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:09.384039 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:09.883710 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:09.883788 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:09.884141 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:10.383940 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:10.384022 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:10.384358 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:10.883643 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:10.883717 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:10.883979 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:10.884026 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:11.384043 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:11.384119 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:11.384476 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:11.884107 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:11.884182 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:11.884497 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:12.383080 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:12.383156 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:12.383420 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:12.883114 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:12.883197 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:12.883546 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:13.383148 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:13.383235 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:13.383567 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:13.383626 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:13.883128 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:13.883206 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:13.883519 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:14.383161 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:14.383237 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:14.383589 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:14.883306 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:14.883385 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:14.883735 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:15.384005 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:15.384082 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:15.384357 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:15.384407 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:15.883074 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:15.883147 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:15.883531 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:16.383351 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:16.383433 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:16.383810 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:16.883361 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:16.883437 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:16.883741 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:17.383154 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:17.383240 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:17.383580 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:17.883164 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:17.883241 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:17.883543 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:17.883590 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:18.383110 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:18.383195 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:18.383512 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:18.883210 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:18.883284 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:18.883632 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:19.383181 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:19.383265 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:19.383589 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:19.883083 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:19.883153 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:19.883418 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:20.383720 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:20.383806 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:20.384138 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:20.384189 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:20.883895 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:20.883977 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:20.884383 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:21.383110 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:21.383179 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:21.383449 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:21.883148 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:21.883222 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:21.883554 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:22.383181 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:22.383267 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:22.383745 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:22.883438 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:22.883512 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:22.883827 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:22.883878 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:23.383219 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:23.383300 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:23.383650 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:23.883352 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:23.883439 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:23.883739 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:24.383094 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:24.383172 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:24.383441 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:24.883170 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:24.883246 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:24.883573 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:25.383180 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:25.383258 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:25.383557 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:25.383602 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:25.883124 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:25.883200 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:25.883530 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:26.383418 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:26.383502 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:26.383820 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:26.883156 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:26.883232 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:26.883574 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:27.383253 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:27.383325 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:27.383640 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:27.383695 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:27.883229 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:27.883308 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:27.883663 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:28.383352 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:28.383428 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:28.383771 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:28.883152 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:28.883224 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:28.883533 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:29.383259 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:29.383346 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:29.383718 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:29.383781 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:29.883468 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:29.883551 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:29.883860 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:30.383098 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:30.383174 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:30.383431 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:30.883167 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:30.883258 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:30.883626 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:31.383185 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:31.383269 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:31.383600 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:31.883135 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:31.883200 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:31.883477 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:31.883524 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:32.383251 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:32.383334 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:32.383667 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:32.883186 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:32.883294 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:32.883590 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:33.383124 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:33.383196 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:33.383537 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:33.883238 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:33.883319 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:33.883668 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:33.883725 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:34.383411 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:34.383500 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:34.383842 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:34.883113 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:34.883201 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:34.883459 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:35.383178 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:35.383251 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:35.383573 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:35.883163 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:35.883245 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:35.883570 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:36.383743 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:36.383821 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:36.384077 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:36.384116 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:36.883871 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:36.883954 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:36.884285 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:37.384043 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:37.384116 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:37.384446 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:37.883126 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:37.883195 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:37.883464 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:38.383151 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:38.383233 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:38.383571 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:38.883272 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:38.883352 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:38.883655 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:38.883702 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:39.383349 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:39.383416 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:39.383686 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:39.883194 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:39.883268 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:39.883616 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:40.383355 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:40.383439 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:40.383825 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:40.883050 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:40.883119 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:40.883381 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:41.383178 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:41.383263 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:41.383602 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:41.383659 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:41.883334 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:41.883418 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:41.883737 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:42.383098 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:42.383164 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:42.383505 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:42.883181 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:42.883256 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:42.883607 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:43.383328 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:43.383407 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:43.383779 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:43.383848 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:43.883040 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:43.883108 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:43.883373 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:44.383062 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:44.383137 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:44.383488 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:44.883210 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:44.883294 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:44.883624 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:45.383177 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:45.383283 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:45.383549 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:45.883285 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:45.883371 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:45.883679 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:45.883730 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:46.383910 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:46.383988 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:46.384338 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:46.883634 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:46.883708 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:46.883972 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:47.383794 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:47.383890 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:47.384333 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:47.883084 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:47.883172 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:47.883512 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:48.383207 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:48.383278 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:48.383553 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:48.383599 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:48.883146 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:48.883219 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:48.883545 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:49.383190 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:49.383267 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:49.383618 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:49.883863 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:49.883935 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:49.884201 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:50.384017 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:50.384095 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:50.384461 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:50.384517 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:50.883189 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:50.883269 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:50.883636 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:51.383067 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:51.383134 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:51.383393 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:51.883095 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:51.883170 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:51.883486 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:52.383088 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:52.383168 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:52.383503 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:52.883649 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:52.883715 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:52.883972 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:52.884013 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:53.383510 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:53.383586 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:53.383942 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:53.883728 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:53.883810 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:53.884186 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:54.383720 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:54.383800 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:54.384075 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:54.883881 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:54.883959 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:54.884315 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:54.884374 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:55.383090 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:55.383174 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:55.383511 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:55.883189 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:55.883267 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:55.883536 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:56.383980 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:56.384072 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:56.384430 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:56.883181 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:56.883264 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:56.883592 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:57.383208 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:57.383283 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:57.383562 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:57.383632 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:57.883178 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:57.883262 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:57.883557 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:58.383270 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:58.383366 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:58.383681 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:58.883065 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:58.883142 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:58.883409 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:59.383139 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:59.383281 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:59.383599 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:59.883295 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:59.883368 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:59.883709 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:59.883767 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:00.383455 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:00.383535 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:00.383834 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:00.883728 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:00.883804 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:00.884143 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:01.383926 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:01.384011 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:01.384371 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:01.883658 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:01.883732 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:01.884049 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:01.884099 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:02.383854 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:02.383936 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:02.384276 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:02.883965 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:02.884044 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:02.884421 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:03.383112 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:03.383191 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:03.383460 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:03.883214 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:03.883310 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:03.883701 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:04.383439 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:04.383520 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:04.383845 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:04.383903 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:04.883132 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:04.883203 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:04.883548 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:05.383170 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:05.383248 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:05.383591 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:05.883308 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:05.883386 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:05.883685 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:06.383882 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:06.383959 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:06.384277 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:06.384350 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:06.884102 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:06.884178 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:06.884513 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:07.383085 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:07.383184 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:07.383543 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:07.883883 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:07.883956 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:07.884221 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:08.384050 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:08.384123 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:08.384452 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:08.384509 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:08.883183 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:08.883259 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:08.883584 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:09.383246 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:09.383318 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:09.383640 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:09.883219 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:09.883299 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:09.883648 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:10.383378 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:10.383458 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:10.383753 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:10.883317 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:10.883388 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:10.883680 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:10.883723 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:11.383702 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:11.383803 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:11.384131 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:11.883721 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:11.883799 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:11.884129 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:12.383663 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:12.383738 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:12.384067 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:12.883854 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:12.883940 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:12.884274 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:12.884334 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:13.384116 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:13.384195 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:13.384538 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:13.883793 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:13.883869 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:13.884135 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:14.383911 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:14.383994 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:14.384297 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:14.883958 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:14.884048 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:14.884401 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:14.884456 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:15.383622 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:15.383700 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:15.383974 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:15.883699 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:15.883778 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:15.884117 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:16.384146 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:16.384226 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:16.384578 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:16.883198 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:16.883271 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:16.883565 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:17.383190 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:17.383273 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:17.383627 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:17.383682 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:17.883355 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:17.883436 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:17.883756 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:18.383116 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:18.383185 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:18.383441 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:18.883189 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:18.883268 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:18.883596 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:19.383187 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:19.383269 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:19.383630 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:19.883313 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:19.883385 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:19.883674 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:19.883725 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:20.383174 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:20.383257 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:20.383614 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:20.883340 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:20.883415 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:20.883771 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:21.383646 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:21.383720 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:21.383985 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:21.883845 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:21.883928 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:21.884313 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:21.884368 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:22.383054 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:22.383138 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:22.383471 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:22.883086 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:22.883170 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:22.883470 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:23.383158 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:23.383233 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:23.383562 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:23.883247 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:23.883321 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:23.883637 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:24.383095 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:24.383165 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:24.383431 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:24.383471 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:24.883124 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:24.883205 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:24.883534 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:25.383173 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:25.383251 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:25.383575 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:25.883126 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:25.883196 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:25.883470 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:26.383072 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:26.383157 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:26.383507 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:26.383568 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:26.883510 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:26.883587 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:26.883957 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:27.383649 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:27.383724 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:27.384102 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:27.883942 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:27.884029 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:27.884418 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:28.383174 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:28.383254 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:28.383611 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:28.383665 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:28.883114 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:28.883191 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:28.883456 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:29.383167 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:29.383248 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:29.383597 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:29.883182 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:29.883258 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:29.883579 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:30.383122 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:30.383199 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:30.383527 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:30.883137 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:30.883215 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:30.883514 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:30.883561 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:31.383185 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:31.383260 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:31.383609 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:31.883900 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:31.883970 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:31.884278 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:32.384086 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:32.384160 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:32.384455 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:32.883182 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:32.883259 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:32.883600 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:32.883657 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:33.383111 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:33.383190 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:33.383455 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:33.883174 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:33.883260 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:33.883641 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:34.383362 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:34.383442 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:34.383802 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:34.883106 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:34.883183 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:34.883439 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:35.383143 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:35.383220 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:35.383551 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:35.383604 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:35.883151 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:35.883229 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:35.883562 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:36.383293 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:36.383366 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:36.383619 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:36.883173 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:36.883255 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:36.883580 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:37.383161 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:37.383237 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:37.383584 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:37.383635 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:37.883107 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:37.883182 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:37.883504 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:38.383163 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:38.383248 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:38.383593 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:38.883178 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:38.883257 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:38.883615 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:39.383868 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:39.383940 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:39.384210 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:39.384251 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:39.883999 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:39.884075 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:39.884422 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:40.383152 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:40.383231 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:40.383560 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:40.883278 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:40.883355 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:40.883619 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:41.383462 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:41.383549 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:41.383883 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:41.883473 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:41.883550 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:41.883893 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:41.883952 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:42.383654 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:42.383728 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:42.384013 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:42.883799 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:42.883875 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:42.884236 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:43.384072 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:43.384157 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:43.384486 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:43.883149 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:43.883222 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:43.883524 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:44.383233 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:44.383315 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:44.383652 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:44.383713 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:44.883155 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:44.883243 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:44.883579 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:45.383126 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:45.383203 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:45.383524 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:45.883188 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:45.883264 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:45.883628 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:46.383346 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:46.383429 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:46.383765 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:46.383819 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:46.883878 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:46.883951 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:46.884224 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:47.384060 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:47.384136 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:47.384469 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:47.883170 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:47.883249 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:47.883589 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:48.383128 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:48.383211 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:48.383474 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:48.883184 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:48.883264 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:48.883602 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:48.883663 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:49.383164 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:49.383241 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:49.383569 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:49.883290 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:49.883367 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:49.883671 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:50.383185 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:50.383268 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:50.383648 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:50.883431 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:50.883514 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:50.883850 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:50.883909 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:51.383652 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:51.383720 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:51.383978 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:51.883444 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:51.883523 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:51.883866 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:52.383586 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:52.383680 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:52.384026 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:52.883655 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:52.883728 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:52.884053 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:52.884105 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:53.383855 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:53.383945 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:53.384271 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:53.884101 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:53.884186 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:53.884529 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:54.383101 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:54.383176 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:54.383443 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:54.883148 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:54.883222 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:54.883575 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:55.383191 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:55.383270 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:55.383608 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:55.383664 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:55.883870 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:55.883946 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:55.884289 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:56.383256 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:56.383351 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:56.383747 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:56.883462 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:56.883538 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:56.883871 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:57.383563 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:57.383638 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:57.383899 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:57.383944 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:57.883683 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:57.883768 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:57.884147 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:58.383932 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:58.384008 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:58.384395 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:58.883091 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:58.883159 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:58.883412 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:59.383084 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:59.383166 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:59.383498 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:59.883178 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:59.883259 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:59.883595 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:59.883655 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:00.392124 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:00.392210 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:00.392556 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:00.883180 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:00.883282 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:00.883653 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:01.383190 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:01.383267 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:01.383567 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:01.883245 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:01.883313 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:01.883583 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:02.383189 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:02.383267 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:02.383605 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:02.383667 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:02.883233 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:02.883317 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:02.883620 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:03.383856 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:03.383927 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:03.384185 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:03.884056 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:03.884135 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:03.884494 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:04.383223 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:04.383311 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:04.383613 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:04.883276 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:04.883345 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:04.883599 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:04.883643 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:05.383163 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:05.383239 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:05.383541 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:05.883213 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:05.883295 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:05.883634 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:06.383304 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:06.383375 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:06.383679 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:06.883401 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:06.883483 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:06.883806 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:06.883865 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:07.383190 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:07.383271 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:07.383648 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:07.883325 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:07.883398 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:07.883710 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:08.383188 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:08.383266 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:08.383566 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:08.883267 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:08.883345 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:08.883690 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:09.384042 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:09.384118 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:09.384458 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:09.384510 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:09.883180 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:09.883253 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:09.883573 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:10.383180 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:10.383261 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:10.383583 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:10.883127 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:10.883204 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:10.883474 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:11.383167 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:11.383240 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:11.383552 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:11.883157 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:11.883234 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:11.883563 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:11.883618 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:12.383084 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:12.383153 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:12.383411 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:12.883181 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:12.883256 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:12.883591 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:13.383188 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:13.383265 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:13.383586 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:13.883123 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:13.883204 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:13.883485 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:14.383559 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:14.383638 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:14.383953 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:14.384012 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:14.883792 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:14.883868 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:14.884213 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:15.383592 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:15.383667 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:15.383925 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:15.883767 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:15.883843 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:15.884202 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:16.383348 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:16.383420 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:16.383758 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:16.883457 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:16.883538 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:16.883795 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:16.883837 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:17.383161 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:17.383238 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:17.383573 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:17.883186 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:17.883271 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:17.883611 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:18.383302 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:18.383377 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:18.383637 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:18.883191 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:18.883269 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:18.883626 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:19.383147 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:19.383224 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:19.383554 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:19.383616 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:19.883116 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:19.883185 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:19.883449 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:20.383135 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:20.383213 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:20.383531 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:20.883180 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:20.883260 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:20.883559 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:21.383124 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:21.383200 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:21.383460 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:21.883132 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:21.883213 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:21.883553 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:21.883608 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:22.383153 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:22.383241 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:22.383543 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:22.883110 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:22.883179 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:22.883439 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:23.383165 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:23.383246 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:23.383640 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:23.883372 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:23.883448 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:23.883789 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:23.883846 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:24.383112 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:24.383188 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:24.383507 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:24.883200 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:24.883275 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:24.883561 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:25.383261 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:25.383336 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:25.383674 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:25.883358 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:25.883437 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:25.883749 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:26.383290 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:26.383368 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:26.383724 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:26.383783 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:26.883478 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:26.883555 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:26.883888 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:27.383604 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:27.383677 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:27.383939 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:27.883757 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:27.883845 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:27.884167 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:28.383852 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:28.383928 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:28.384269 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:28.384325 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:28.883626 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:28.883692 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:28.883958 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:29.383717 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:29.383796 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:29.384139 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:29.883960 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:29.884036 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:29.884369 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:30.383625 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:30.383694 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:30.383980 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:30.883744 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:30.883816 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:30.884150 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:30.884205 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:31.383977 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:31.384060 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:31.384393 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:31.883624 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:31.883716 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:31.883977 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:32.383741 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:32.383814 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:32.384155 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:32.883968 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:32.884055 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:32.884386 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:32.884443 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:33.383735 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:33.383805 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:33.384072 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:33.883915 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:33.883991 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:33.884369 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:34.383120 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:34.383204 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:34.383566 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:34.883846 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:34.883924 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:34.884224 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:35.383982 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:35.384056 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:35.384427 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:35.384483 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:35.884112 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:35.884192 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:35.884530 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:36.383425 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:36.383499 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:36.383766 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:36.883482 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:36.883565 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:36.883947 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:37.383742 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:37.383819 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:37.384158 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:37.883707 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:37.883774 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:37.884034 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:37.884074 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:38.383850 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:38.383958 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:38.384324 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:38.883075 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:38.883152 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:38.883501 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:39.383112 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:39.383186 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:39.383448 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:39.883223 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:39.883319 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:39.883638 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:40.383373 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:40.383445 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:40.383734 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:40.383787 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:40.883050 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:40.883126 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:40.883428 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:41.383217 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:41.383293 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:41.383634 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:41.883211 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:41.883294 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:41.883578 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:42.383069 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:42.383136 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:42.383390 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:42.883177 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:42.883268 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:42.883564 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:42.883610 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:43.383316 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:43.383402 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:43.383752 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:43.884076 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:43.884150 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:43.884466 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:44.383187 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:44.383282 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:44.383645 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:44.883389 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:44.883464 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:44.883804 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:44.883864 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:45.383117 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:45.383195 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:45.383502 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:45.883172 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:45.883255 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:45.883601 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:46.383362 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:46.383444 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:46.383798 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:46.883132 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:46.883204 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:46.883525 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:47.383245 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:47.383343 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:47.383724 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:47.383787 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:47.883322 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:47.883396 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:47.883705 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:48.383414 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:48.383490 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:48.383778 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:48.883192 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:48.883270 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:48.883613 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:49.383457 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:49.383533 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:49.383864 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:49.383922 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:49.883061 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:49.883134 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:49.883396 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:50.383133 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:50.383215 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:50.383592 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:50.883345 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:50.883424 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:50.883767 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:51.383609 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:51.383687 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:51.383946 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:51.383994 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:51.883714 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:51.883789 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:51.884128 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:52.383943 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:52.384028 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:52.384399 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:52.883710 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:52.883786 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:52.884049 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:53.383826 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:53.383902 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:53.384299 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:53.384353 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:53.883075 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:53.883154 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:53.883549 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:54.383241 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:54.383316 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:54.383579 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:54.883201 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:54.883281 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:54.883627 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:55.383208 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:55.383284 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:55.383613 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:55.883300 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:55.883372 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:55.883626 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:55.883666 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:56.383898 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:56.383987 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:56.384342 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:56.883076 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:56.883152 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:56.883529 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:57.383840 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:57.383919 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:57.384396 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:57.883127 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:57.883220 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:57.883601 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:58.383185 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:58.383263 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:58.383601 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:58.383658 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:58.883103 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:58.883174 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:58.883430 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:59.383161 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:59.383239 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:59.383528 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:59.883235 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:59.883319 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:59.883655 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:00.383085 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:00.383175 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:00.383480 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:00.883287 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:00.883398 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:00.883768 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:25:00.883828 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:25:01.383727 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:01.383809 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:01.384138 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:01.883728 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:01.883797 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:01.884120 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:02.383919 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:02.383991 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:02.384291 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:02.884049 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:02.884120 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:02.884420 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:25:02.884485 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:25:03.383819 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:03.383888 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:03.384209 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:03.884004 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:03.884091 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:03.884451 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:04.384095 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:04.384179 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:04.384501 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:04.883234 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:04.883303 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:04.883584 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:05.383144 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:05.383220 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:05.383542 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:25:05.383601 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	... (the same GET on /api/v1/nodes/functional-291288 repeated every ~500ms, each attempt refused with "dial tcp 192.168.49.2:8441: connect: connection refused"; identical entries from 09:25:05 through 09:25:26 elided) ...
	I1124 09:25:27.383124 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:27.383182 1701291 node_ready.go:38] duration metric: took 6m0.000242478s for node "functional-291288" to be "Ready" ...
	I1124 09:25:27.386338 1701291 out.go:203] 
	W1124 09:25:27.389204 1701291 out.go:285] X Exiting due to GUEST_START: failed to start node: wait 6m0s for node: waiting for node to be ready: WaitNodeCondition: context deadline exceeded
	W1124 09:25:27.389224 1701291 out.go:285] * 
	W1124 09:25:27.391374 1701291 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1124 09:25:27.394404 1701291 out.go:203] 
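The trace above is minikube's node-readiness wait: a GET against `/api/v1/nodes/functional-291288` every ~500ms until a 6m0s deadline, with every attempt refused, so the wait expires and minikube exits with `GUEST_START`. A minimal sketch of that bounded-poll pattern (a generic illustration with a hypothetical `check` callable, not minikube's actual implementation):

```python
import time

def wait_until_ready(check, timeout=360.0, interval=0.5,
                     clock=time.monotonic, sleep=time.sleep):
    """Poll check() every `interval` seconds until it returns True
    or `timeout` seconds elapse. Returns True on success, False on
    deadline expiry (mirroring the 6m0s node wait in the log)."""
    deadline = clock() + timeout
    while clock() < deadline:
        if check():
            return True
        sleep(interval)
    return False
```

In the log, `check()` is effectively the GET on the node's "Ready" condition; since every call fails with connection refused, the loop runs the full six minutes and returns failure.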
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> containerd <==
	Nov 24 09:25:34 functional-291288 containerd[5880]: time="2025-11-24T09:25:34.653424959Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.3\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Nov 24 09:25:35 functional-291288 containerd[5880]: time="2025-11-24T09:25:35.680892289Z" level=info msg="No images store for sha256:a1f83055284ec302ac691d8677946d8b4e772fb7071d39ada1cc9184cb70814b"
	Nov 24 09:25:35 functional-291288 containerd[5880]: time="2025-11-24T09:25:35.683044385Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:latest\""
	Nov 24 09:25:35 functional-291288 containerd[5880]: time="2025-11-24T09:25:35.689835871Z" level=info msg="ImageCreate event name:\"sha256:8cb2091f603e75187e2f6226c5901d12e00b1d1f778c6471ae4578e8a1c4724a\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Nov 24 09:25:35 functional-291288 containerd[5880]: time="2025-11-24T09:25:35.690177864Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:latest\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Nov 24 09:25:36 functional-291288 containerd[5880]: time="2025-11-24T09:25:36.659460561Z" level=info msg="No images store for sha256:f22c82f26787d6279eede4444a6bf746ad608345b849f71f03d60b8e0589531e"
	Nov 24 09:25:36 functional-291288 containerd[5880]: time="2025-11-24T09:25:36.661620279Z" level=info msg="ImageCreate event name:\"docker.io/library/minikube-local-cache-test:functional-291288\""
	Nov 24 09:25:36 functional-291288 containerd[5880]: time="2025-11-24T09:25:36.668726508Z" level=info msg="ImageCreate event name:\"sha256:cee23f3226e286bed41a2ff2b5fad8e6e395a2934880a19d2a902b07140a4221\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Nov 24 09:25:36 functional-291288 containerd[5880]: time="2025-11-24T09:25:36.669034770Z" level=info msg="ImageUpdate event name:\"docker.io/library/minikube-local-cache-test:functional-291288\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Nov 24 09:25:37 functional-291288 containerd[5880]: time="2025-11-24T09:25:37.473140767Z" level=info msg="RemoveImage \"registry.k8s.io/pause:latest\""
	Nov 24 09:25:37 functional-291288 containerd[5880]: time="2025-11-24T09:25:37.475635774Z" level=info msg="ImageDelete event name:\"registry.k8s.io/pause:latest\""
	Nov 24 09:25:37 functional-291288 containerd[5880]: time="2025-11-24T09:25:37.477631216Z" level=info msg="ImageDelete event name:\"sha256:8cb2091f603e75187e2f6226c5901d12e00b1d1f778c6471ae4578e8a1c4724a\""
	Nov 24 09:25:37 functional-291288 containerd[5880]: time="2025-11-24T09:25:37.491716240Z" level=info msg="RemoveImage \"registry.k8s.io/pause:latest\" returns successfully"
	Nov 24 09:25:38 functional-291288 containerd[5880]: time="2025-11-24T09:25:38.419124966Z" level=info msg="RemoveImage \"registry.k8s.io/pause:3.1\""
	Nov 24 09:25:38 functional-291288 containerd[5880]: time="2025-11-24T09:25:38.421484178Z" level=info msg="ImageDelete event name:\"registry.k8s.io/pause:3.1\""
	Nov 24 09:25:38 functional-291288 containerd[5880]: time="2025-11-24T09:25:38.423519611Z" level=info msg="ImageDelete event name:\"sha256:8057e0500773a37cde2cff041eb13ebd68c748419a2fbfd1dfb5bf38696cc8e5\""
	Nov 24 09:25:38 functional-291288 containerd[5880]: time="2025-11-24T09:25:38.431745023Z" level=info msg="RemoveImage \"registry.k8s.io/pause:3.1\" returns successfully"
	Nov 24 09:25:38 functional-291288 containerd[5880]: time="2025-11-24T09:25:38.591084144Z" level=info msg="No images store for sha256:a1f83055284ec302ac691d8677946d8b4e772fb7071d39ada1cc9184cb70814b"
	Nov 24 09:25:38 functional-291288 containerd[5880]: time="2025-11-24T09:25:38.593237060Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:latest\""
	Nov 24 09:25:38 functional-291288 containerd[5880]: time="2025-11-24T09:25:38.600078138Z" level=info msg="ImageCreate event name:\"sha256:8cb2091f603e75187e2f6226c5901d12e00b1d1f778c6471ae4578e8a1c4724a\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Nov 24 09:25:38 functional-291288 containerd[5880]: time="2025-11-24T09:25:38.601791057Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:latest\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Nov 24 09:25:38 functional-291288 containerd[5880]: time="2025-11-24T09:25:38.715557551Z" level=info msg="No images store for sha256:3ac89611d5efd8eb74174b1f04c33b7e73b651cec35b5498caf0cfdd2efd7d48"
	Nov 24 09:25:38 functional-291288 containerd[5880]: time="2025-11-24T09:25:38.717850948Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.1\""
	Nov 24 09:25:38 functional-291288 containerd[5880]: time="2025-11-24T09:25:38.724559939Z" level=info msg="ImageCreate event name:\"sha256:8057e0500773a37cde2cff041eb13ebd68c748419a2fbfd1dfb5bf38696cc8e5\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Nov 24 09:25:38 functional-291288 containerd[5880]: time="2025-11-24T09:25:38.724882822Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.1\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:25:40.474235    9845 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:25:40.474833    9845 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:25:40.476509    9845 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:25:40.477011    9845 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:25:40.478563    9845 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
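Both kubectl here (`localhost:8441`) and the readiness poll above (`192.168.49.2:8441`) fail with `connect: connection refused`, i.e. nothing is listening on the apiserver port at all. That condition can be confirmed with a plain TCP probe; a minimal sketch (a generic helper, not part of the minikube or kubectl tooling):

```python
import socket

def tcp_reachable(host, port, timeout=1.0):
    """Return True if a TCP connection to host:port succeeds, False on
    refusal or timeout -- the same condition kubectl reports above as
    'connection refused' when the apiserver is not running."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

On this node the probe would return False for port 8441; the root cause is not networking but the kubelet crash loop shown further down, which prevents the apiserver static pod from ever starting.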
	
	
	==> dmesg <==
	[Nov24 08:20] overlayfs: idmapped layers are currently not supported
	[Nov24 08:21] overlayfs: idmapped layers are currently not supported
	[Nov24 08:22] overlayfs: idmapped layers are currently not supported
	[Nov24 08:23] overlayfs: idmapped layers are currently not supported
	[Nov24 08:24] overlayfs: idmapped layers are currently not supported
	[ +19.566509] overlayfs: idmapped layers are currently not supported
	[Nov24 08:25] overlayfs: idmapped layers are currently not supported
	[ +35.934095] overlayfs: idmapped layers are currently not supported
	[Nov24 08:27] overlayfs: idmapped layers are currently not supported
	[Nov24 08:28] overlayfs: idmapped layers are currently not supported
	[Nov24 08:29] overlayfs: idmapped layers are currently not supported
	[Nov24 08:30] overlayfs: idmapped layers are currently not supported
	[Nov24 08:31] overlayfs: idmapped layers are currently not supported
	[ +44.505180] overlayfs: idmapped layers are currently not supported
	[Nov24 08:32] overlayfs: idmapped layers are currently not supported
	[Nov24 08:33] overlayfs: idmapped layers are currently not supported
	[Nov24 08:34] overlayfs: idmapped layers are currently not supported
	[ +21.137877] overlayfs: idmapped layers are currently not supported
	[ +28.724175] overlayfs: idmapped layers are currently not supported
	[Nov24 08:36] overlayfs: idmapped layers are currently not supported
	[Nov24 08:37] overlayfs: idmapped layers are currently not supported
	[Nov24 08:38] overlayfs: idmapped layers are currently not supported
	[  +4.063834] overlayfs: idmapped layers are currently not supported
	[Nov24 08:40] overlayfs: idmapped layers are currently not supported
	[Nov24 08:42] kauditd_printk_skb: 8 callbacks suppressed
	
	
	==> kernel <==
	 09:25:40 up  8:07,  0 user,  load average: 0.56, 0.31, 0.50
	Linux functional-291288 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Nov 24 09:25:37 functional-291288 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Nov 24 09:25:37 functional-291288 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 824.
	Nov 24 09:25:37 functional-291288 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:25:37 functional-291288 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:25:37 functional-291288 kubelet[9611]: E1124 09:25:37.952470    9611 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Nov 24 09:25:37 functional-291288 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Nov 24 09:25:37 functional-291288 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Nov 24 09:25:38 functional-291288 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 825.
	Nov 24 09:25:38 functional-291288 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:25:38 functional-291288 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:25:38 functional-291288 kubelet[9711]: E1124 09:25:38.685468    9711 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Nov 24 09:25:38 functional-291288 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Nov 24 09:25:38 functional-291288 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Nov 24 09:25:39 functional-291288 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 826.
	Nov 24 09:25:39 functional-291288 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:25:39 functional-291288 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:25:39 functional-291288 kubelet[9739]: E1124 09:25:39.449261    9739 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Nov 24 09:25:39 functional-291288 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Nov 24 09:25:39 functional-291288 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Nov 24 09:25:40 functional-291288 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 827.
	Nov 24 09:25:40 functional-291288 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:25:40 functional-291288 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:25:40 functional-291288 kubelet[9767]: E1124 09:25:40.191081    9767 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Nov 24 09:25:40 functional-291288 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Nov 24 09:25:40 functional-291288 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	

-- /stdout --
helpers_test.go:262: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-291288 -n functional-291288
helpers_test.go:262: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-291288 -n functional-291288: exit status 2 (350.534118ms)

-- stdout --
	Stopped

-- /stdout --
helpers_test.go:262: status error: exit status 2 (may be ok)
helpers_test.go:264: "functional-291288" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/MinikubeKubectlCmd (2.29s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/MinikubeKubectlCmdDirectly (2.43s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/MinikubeKubectlCmdDirectly
functional_test.go:756: (dbg) Run:  out/kubectl --context functional-291288 get pods
functional_test.go:756: (dbg) Non-zero exit: out/kubectl --context functional-291288 get pods: exit status 1 (107.384251ms)

** stderr ** 
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?

** /stderr **
functional_test.go:759: failed to run kubectl directly. args "out/kubectl --context functional-291288 get pods": exit status 1
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:223: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/MinikubeKubectlCmdDirectly]: network settings <======
helpers_test.go:230: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:238: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/MinikubeKubectlCmdDirectly]: docker inspect <======
helpers_test.go:239: (dbg) Run:  docker inspect functional-291288
helpers_test.go:243: (dbg) docker inspect functional-291288:

-- stdout --
	[
	    {
	        "Id": "70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52",
	        "Created": "2025-11-24T09:10:51.896020191Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 1695240,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-11-24T09:10:51.968983407Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:572c983e466f1f784136812eef5cc59ac623db764bc7704d3676c4643993fd08",
	        "ResolvConfPath": "/var/lib/docker/containers/70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52/hostname",
	        "HostsPath": "/var/lib/docker/containers/70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52/hosts",
	        "LogPath": "/var/lib/docker/containers/70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52/70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52-json.log",
	        "Name": "/functional-291288",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "functional-291288:/var",
	                "/lib/modules:/lib/modules:ro"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-291288",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52",
	                "LowerDir": "/var/lib/docker/overlay2/3cc6f5f8c809d502c515552ad283ef0dee330beb830a15376b0447c77fbc81b7-init/diff:/var/lib/docker/overlay2/bf294786c7ade59cb761c7bdcc27045362c3779b00a569b685443f34ade17737/diff",
	                "MergedDir": "/var/lib/docker/overlay2/3cc6f5f8c809d502c515552ad283ef0dee330beb830a15376b0447c77fbc81b7/merged",
	                "UpperDir": "/var/lib/docker/overlay2/3cc6f5f8c809d502c515552ad283ef0dee330beb830a15376b0447c77fbc81b7/diff",
	                "WorkDir": "/var/lib/docker/overlay2/3cc6f5f8c809d502c515552ad283ef0dee330beb830a15376b0447c77fbc81b7/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "volume",
	                "Name": "functional-291288",
	                "Source": "/var/lib/docker/volumes/functional-291288/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            },
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-291288",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-291288",
	                "name.minikube.sigs.k8s.io": "functional-291288",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "09c1c2eef0dca6362dde63b4cbc372c0cfa3e4fd084b8745043d8b88925691bf",
	            "SandboxKey": "/var/run/docker/netns/09c1c2eef0dc",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34684"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34685"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34688"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34686"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34687"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-291288": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "7e:49:22:0b:f9:2c",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "1e8f91e8ad9f46b831bbb1b0589b0022d940ee9875e64a648dc80612f3ca93dc",
	                    "EndpointID": "5de5ca8ccb07584b21e6e4e30dba12e0233e8d28c3e48e705cddffe75263b337",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-291288",
	                        "70848be15fcc"
	                    ]
	                }
	            }
	        }
	    }
	]

-- /stdout --
helpers_test.go:247: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-291288 -n functional-291288
helpers_test.go:247: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-291288 -n functional-291288: exit status 2 (336.161051ms)

-- stdout --
	Running

-- /stdout --
helpers_test.go:247: status error: exit status 2 (may be ok)
helpers_test.go:252: <<< TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/MinikubeKubectlCmdDirectly FAILED: start of post-mortem logs <<<
helpers_test.go:253: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/MinikubeKubectlCmdDirectly]: minikube logs <======
helpers_test.go:255: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 logs -n 25
helpers_test.go:255: (dbg) Done: out/minikube-linux-arm64 -p functional-291288 logs -n 25: (1.009065829s)
helpers_test.go:260: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/MinikubeKubectlCmdDirectly logs: 
-- stdout --
	
	==> Audit <==
	┌─────────┬─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│ COMMAND │                                                                          ARGS                                                                           │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├─────────┼─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ image   │ functional-941011 image ls --format short --alsologtostderr                                                                                             │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:07 UTC │ 24 Nov 25 09:07 UTC │
	│ image   │ functional-941011 image ls --format yaml --alsologtostderr                                                                                              │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:07 UTC │ 24 Nov 25 09:07 UTC │
	│ ssh     │ functional-941011 ssh pgrep buildkitd                                                                                                                   │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:07 UTC │                     │
	│ image   │ functional-941011 image build -t localhost/my-image:functional-941011 testdata/build --alsologtostderr                                                  │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:07 UTC │ 24 Nov 25 09:07 UTC │
	│ image   │ functional-941011 image ls                                                                                                                              │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:07 UTC │ 24 Nov 25 09:07 UTC │
	│ image   │ functional-941011 image ls --format json --alsologtostderr                                                                                              │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:07 UTC │ 24 Nov 25 09:07 UTC │
	│ image   │ functional-941011 image ls --format table --alsologtostderr                                                                                             │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:07 UTC │ 24 Nov 25 09:07 UTC │
	│ delete  │ -p functional-941011                                                                                                                                    │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:10 UTC │ 24 Nov 25 09:10 UTC │
	│ start   │ -p functional-291288 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-beta.0 │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:10 UTC │                     │
	│ start   │ -p functional-291288 --alsologtostderr -v=8                                                                                                             │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:19 UTC │                     │
	│ cache   │ functional-291288 cache add registry.k8s.io/pause:3.1                                                                                                   │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:25 UTC │ 24 Nov 25 09:25 UTC │
	│ cache   │ functional-291288 cache add registry.k8s.io/pause:3.3                                                                                                   │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:25 UTC │ 24 Nov 25 09:25 UTC │
	│ cache   │ functional-291288 cache add registry.k8s.io/pause:latest                                                                                                │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:25 UTC │ 24 Nov 25 09:25 UTC │
	│ cache   │ functional-291288 cache add minikube-local-cache-test:functional-291288                                                                                 │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:25 UTC │ 24 Nov 25 09:25 UTC │
	│ cache   │ functional-291288 cache delete minikube-local-cache-test:functional-291288                                                                              │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:25 UTC │ 24 Nov 25 09:25 UTC │
	│ cache   │ delete registry.k8s.io/pause:3.3                                                                                                                        │ minikube          │ jenkins │ v1.37.0 │ 24 Nov 25 09:25 UTC │ 24 Nov 25 09:25 UTC │
	│ cache   │ list                                                                                                                                                    │ minikube          │ jenkins │ v1.37.0 │ 24 Nov 25 09:25 UTC │ 24 Nov 25 09:25 UTC │
	│ ssh     │ functional-291288 ssh sudo crictl images                                                                                                                │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:25 UTC │ 24 Nov 25 09:25 UTC │
	│ ssh     │ functional-291288 ssh sudo crictl rmi registry.k8s.io/pause:latest                                                                                      │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:25 UTC │ 24 Nov 25 09:25 UTC │
	│ ssh     │ functional-291288 ssh sudo crictl inspecti registry.k8s.io/pause:latest                                                                                 │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:25 UTC │                     │
	│ cache   │ functional-291288 cache reload                                                                                                                          │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:25 UTC │ 24 Nov 25 09:25 UTC │
	│ ssh     │ functional-291288 ssh sudo crictl inspecti registry.k8s.io/pause:latest                                                                                 │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:25 UTC │ 24 Nov 25 09:25 UTC │
	│ cache   │ delete registry.k8s.io/pause:3.1                                                                                                                        │ minikube          │ jenkins │ v1.37.0 │ 24 Nov 25 09:25 UTC │ 24 Nov 25 09:25 UTC │
	│ cache   │ delete registry.k8s.io/pause:latest                                                                                                                     │ minikube          │ jenkins │ v1.37.0 │ 24 Nov 25 09:25 UTC │ 24 Nov 25 09:25 UTC │
	│ kubectl │ functional-291288 kubectl -- --context functional-291288 get pods                                                                                       │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:25 UTC │                     │
	└─────────┴─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/11/24 09:19:20
	Running on machine: ip-172-31-30-239
	Binary: Built with gc go1.25.3 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1124 09:19:20.929895 1701291 out.go:360] Setting OutFile to fd 1 ...
	I1124 09:19:20.930102 1701291 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1124 09:19:20.930128 1701291 out.go:374] Setting ErrFile to fd 2...
	I1124 09:19:20.930149 1701291 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1124 09:19:20.930488 1701291 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21978-1652607/.minikube/bin
	I1124 09:19:20.930883 1701291 out.go:368] Setting JSON to false
	I1124 09:19:20.931751 1701291 start.go:133] hostinfo: {"hostname":"ip-172-31-30-239","uptime":28890,"bootTime":1763947071,"procs":158,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"92f46a7d-c249-4c12-924a-77f64874c910"}
	I1124 09:19:20.931843 1701291 start.go:143] virtualization:  
	I1124 09:19:20.938521 1701291 out.go:179] * [functional-291288] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1124 09:19:20.941571 1701291 out.go:179]   - MINIKUBE_LOCATION=21978
	I1124 09:19:20.941660 1701291 notify.go:221] Checking for updates...
	I1124 09:19:20.947508 1701291 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1124 09:19:20.950282 1701291 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/21978-1652607/kubeconfig
	I1124 09:19:20.953189 1701291 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/21978-1652607/.minikube
	I1124 09:19:20.956068 1701291 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1124 09:19:20.958991 1701291 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1124 09:19:20.962273 1701291 config.go:182] Loaded profile config "functional-291288": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1124 09:19:20.962433 1701291 driver.go:422] Setting default libvirt URI to qemu:///system
	I1124 09:19:20.992476 1701291 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1124 09:19:20.992586 1701291 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1124 09:19:21.057666 1701291 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-11-24 09:19:21.047762616 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:a
arch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214831104 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx P
ath:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1124 09:19:21.057787 1701291 docker.go:319] overlay module found
	I1124 09:19:21.060830 1701291 out.go:179] * Using the docker driver based on existing profile
	I1124 09:19:21.063549 1701291 start.go:309] selected driver: docker
	I1124 09:19:21.063567 1701291 start.go:927] validating driver "docker" against &{Name:functional-291288 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-291288 Namespace:default APIServerHAVIP: APIS
erverName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false Disa
bleCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1124 09:19:21.063661 1701291 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1124 09:19:21.063775 1701291 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1124 09:19:21.121254 1701291 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-11-24 09:19:21.111151392 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:a
arch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214831104 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx P
ath:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1124 09:19:21.121789 1701291 cni.go:84] Creating CNI manager for ""
	I1124 09:19:21.121863 1701291 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1124 09:19:21.121942 1701291 start.go:353] cluster config:
	{Name:functional-291288 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-291288 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local C
ontainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPa
th: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1124 09:19:21.125134 1701291 out.go:179] * Starting "functional-291288" primary control-plane node in "functional-291288" cluster
	I1124 09:19:21.127989 1701291 cache.go:134] Beginning downloading kic base image for docker with containerd
	I1124 09:19:21.131005 1701291 out.go:179] * Pulling base image v0.0.48-1763789673-21948 ...
	I1124 09:19:21.133917 1701291 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f in local docker daemon
	I1124 09:19:21.133914 1701291 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
	I1124 09:19:21.154192 1701291 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f in local docker daemon, skipping pull
	I1124 09:19:21.154216 1701291 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f exists in daemon, skipping load
	W1124 09:19:21.197477 1701291 preload.go:144] https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.35.0-beta.0/preloaded-images-k8s-v18-v1.35.0-beta.0-containerd-overlay2-arm64.tar.lz4 status code: 404
	W1124 09:19:21.391690 1701291 preload.go:144] https://github.com/kubernetes-sigs/minikube-preloads/releases/download/v18/preloaded-images-k8s-v18-v1.35.0-beta.0-containerd-overlay2-arm64.tar.lz4 status code: 404
	I1124 09:19:21.391947 1701291 profile.go:143] Saving config to /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/config.json ...
	I1124 09:19:21.392070 1701291 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:19:21.392253 1701291 cache.go:243] Successfully downloaded all kic artifacts
	I1124 09:19:21.392304 1701291 start.go:360] acquireMachinesLock for functional-291288: {Name:mk85384dc057570e1f34db593d357cea738652c4 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:19:21.392403 1701291 start.go:364] duration metric: took 38.802µs to acquireMachinesLock for "functional-291288"
	I1124 09:19:21.392443 1701291 start.go:96] Skipping create...Using existing machine configuration
	I1124 09:19:21.392463 1701291 fix.go:54] fixHost starting: 
	I1124 09:19:21.392780 1701291 cli_runner.go:164] Run: docker container inspect functional-291288 --format={{.State.Status}}
	I1124 09:19:21.413220 1701291 fix.go:112] recreateIfNeeded on functional-291288: state=Running err=<nil>
	W1124 09:19:21.413254 1701291 fix.go:138] unexpected machine state, will restart: <nil>
	I1124 09:19:21.416439 1701291 out.go:252] * Updating the running docker "functional-291288" container ...
	I1124 09:19:21.416481 1701291 machine.go:94] provisionDockerMachine start ...
	I1124 09:19:21.416565 1701291 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:19:21.444143 1701291 main.go:143] libmachine: Using SSH client type: native
	I1124 09:19:21.444471 1701291 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 34684 <nil> <nil>}
	I1124 09:19:21.444480 1701291 main.go:143] libmachine: About to run SSH command:
	hostname
	I1124 09:19:21.581815 1701291 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:19:21.598566 1701291 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-291288
	
	I1124 09:19:21.598592 1701291 ubuntu.go:182] provisioning hostname "functional-291288"
	I1124 09:19:21.598669 1701291 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:19:21.623443 1701291 main.go:143] libmachine: Using SSH client type: native
	I1124 09:19:21.623759 1701291 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 34684 <nil> <nil>}
	I1124 09:19:21.623771 1701291 main.go:143] libmachine: About to run SSH command:
	sudo hostname functional-291288 && echo "functional-291288" | sudo tee /etc/hostname
	I1124 09:19:21.758572 1701291 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:19:21.799121 1701291 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-291288
	
	I1124 09:19:21.799200 1701291 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:19:21.831127 1701291 main.go:143] libmachine: Using SSH client type: native
	I1124 09:19:21.831435 1701291 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 34684 <nil> <nil>}
	I1124 09:19:21.831451 1701291 main.go:143] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sfunctional-291288' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 functional-291288/g' /etc/hosts;
				else 
					echo '127.0.1.1 functional-291288' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1124 09:19:21.919264 1701291 cache.go:107] acquiring lock: {Name:mk22a10f0ce1f3295b61e7e76c455d0494a3e278 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:19:21.919300 1701291 cache.go:107] acquiring lock: {Name:mk1cf42e67442503a46c578224bd3cb68bf682d4 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:19:21.919365 1701291 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 exists
	I1124 09:19:21.919369 1701291 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 exists
	I1124 09:19:21.919375 1701291 cache.go:96] cache image "registry.k8s.io/kube-apiserver:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0" took 74.126µs
	I1124 09:19:21.919377 1701291 cache.go:96] cache image "gcr.io/k8s-minikube/storage-provisioner:v5" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5" took 130.274µs
	I1124 09:19:21.919383 1701291 cache.go:80] save to tar file registry.k8s.io/kube-apiserver:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 succeeded
	I1124 09:19:21.919385 1701291 cache.go:80] save to tar file gcr.io/k8s-minikube/storage-provisioner:v5 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 succeeded
	I1124 09:19:21.919395 1701291 cache.go:107] acquiring lock: {Name:mkfdc49c8e68aee34cee0c9d441ae8a4dca675c6 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:19:21.919407 1701291 cache.go:107] acquiring lock: {Name:mk85f1502dbb97830776608fb729eb3605e112e6 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:19:21.919449 1701291 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 exists
	I1124 09:19:21.919433 1701291 cache.go:107] acquiring lock: {Name:mkdbf38e05e2c47c1a7a906a2236e9e7020a94c5 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:19:21.919454 1701291 cache.go:96] cache image "registry.k8s.io/pause:3.10.1" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1" took 48.764µs
	I1124 09:19:21.919460 1701291 cache.go:80] save to tar file registry.k8s.io/pause:3.10.1 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 succeeded
	I1124 09:19:21.919471 1701291 cache.go:107] acquiring lock: {Name:mk46ce3b59d7e062b3dbc8a90fe5b4231f256471 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:19:21.919266 1701291 cache.go:107] acquiring lock: {Name:mk80fdbe7cdb5bc17c2a82b4ecfd00214559a435 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:19:21.919495 1701291 cache.go:107] acquiring lock: {Name:mk726502cb84c177b2e14fee88512325761511c5 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:19:21.919506 1701291 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 exists
	I1124 09:19:21.919511 1701291 cache.go:96] cache image "registry.k8s.io/kube-proxy:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0" took 262.074µs
	I1124 09:19:21.919517 1701291 cache.go:80] save to tar file registry.k8s.io/kube-proxy:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 succeeded
	I1124 09:19:21.919425 1701291 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 exists
	I1124 09:19:21.919525 1701291 cache.go:96] cache image "registry.k8s.io/kube-controller-manager:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0" took 132.661µs
	I1124 09:19:21.919532 1701291 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 exists
	I1124 09:19:21.919476 1701291 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 exists
	I1124 09:19:21.919540 1701291 cache.go:96] cache image "registry.k8s.io/coredns/coredns:v1.13.1" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1" took 48.796µs
	I1124 09:19:21.919547 1701291 cache.go:80] save to tar file registry.k8s.io/coredns/coredns:v1.13.1 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 succeeded
	I1124 09:19:21.919541 1701291 cache.go:96] cache image "registry.k8s.io/kube-scheduler:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0" took 109.4µs
	I1124 09:19:21.919553 1701291 cache.go:80] save to tar file registry.k8s.io/kube-scheduler:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 succeeded
	I1124 09:19:21.919533 1701291 cache.go:80] save to tar file registry.k8s.io/kube-controller-manager:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 succeeded
	I1124 09:19:21.919557 1701291 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.5.24-0 exists
	I1124 09:19:21.919563 1701291 cache.go:96] cache image "registry.k8s.io/etcd:3.5.24-0" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.5.24-0" took 93.482µs
	I1124 09:19:21.919568 1701291 cache.go:80] save to tar file registry.k8s.io/etcd:3.5.24-0 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.5.24-0 succeeded
	I1124 09:19:21.919582 1701291 cache.go:87] Successfully saved all images to host disk.
	I1124 09:19:21.982718 1701291 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	I1124 09:19:21.982799 1701291 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/21978-1652607/.minikube CaCertPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/21978-1652607/.minikube}
	I1124 09:19:21.982852 1701291 ubuntu.go:190] setting up certificates
	I1124 09:19:21.982880 1701291 provision.go:84] configureAuth start
	I1124 09:19:21.982954 1701291 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-291288
	I1124 09:19:22.001413 1701291 provision.go:143] copyHostCerts
	I1124 09:19:22.001464 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.pem
	I1124 09:19:22.001516 1701291 exec_runner.go:144] found /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.pem, removing ...
	I1124 09:19:22.001530 1701291 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.pem
	I1124 09:19:22.001614 1701291 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.pem (1078 bytes)
	I1124 09:19:22.001708 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cert.pem
	I1124 09:19:22.001726 1701291 exec_runner.go:144] found /home/jenkins/minikube-integration/21978-1652607/.minikube/cert.pem, removing ...
	I1124 09:19:22.001731 1701291 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21978-1652607/.minikube/cert.pem
	I1124 09:19:22.001757 1701291 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/21978-1652607/.minikube/cert.pem (1123 bytes)
	I1124 09:19:22.001795 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/21978-1652607/.minikube/key.pem
	I1124 09:19:22.001816 1701291 exec_runner.go:144] found /home/jenkins/minikube-integration/21978-1652607/.minikube/key.pem, removing ...
	I1124 09:19:22.001820 1701291 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21978-1652607/.minikube/key.pem
	I1124 09:19:22.001845 1701291 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/21978-1652607/.minikube/key.pem (1679 bytes)
	I1124 09:19:22.001893 1701291 provision.go:117] generating server cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca-key.pem org=jenkins.functional-291288 san=[127.0.0.1 192.168.49.2 functional-291288 localhost minikube]
	I1124 09:19:22.129571 1701291 provision.go:177] copyRemoteCerts
	I1124 09:19:22.129639 1701291 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1124 09:19:22.129681 1701291 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:19:22.147944 1701291 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34684 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-291288/id_rsa Username:docker}
	I1124 09:19:22.254207 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I1124 09:19:22.254271 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I1124 09:19:22.271706 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server.pem -> /etc/docker/server.pem
	I1124 09:19:22.271768 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server.pem --> /etc/docker/server.pem (1220 bytes)
	I1124 09:19:22.289262 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I1124 09:19:22.289325 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I1124 09:19:22.306621 1701291 provision.go:87] duration metric: took 323.706379ms to configureAuth
	I1124 09:19:22.306647 1701291 ubuntu.go:206] setting minikube options for container-runtime
	I1124 09:19:22.306839 1701291 config.go:182] Loaded profile config "functional-291288": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1124 09:19:22.306847 1701291 machine.go:97] duration metric: took 890.360502ms to provisionDockerMachine
	I1124 09:19:22.306855 1701291 start.go:293] postStartSetup for "functional-291288" (driver="docker")
	I1124 09:19:22.306866 1701291 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1124 09:19:22.306912 1701291 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1124 09:19:22.306953 1701291 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:19:22.324012 1701291 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34684 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-291288/id_rsa Username:docker}
	I1124 09:19:22.434427 1701291 ssh_runner.go:195] Run: cat /etc/os-release
	I1124 09:19:22.437860 1701291 command_runner.go:130] > PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	I1124 09:19:22.437881 1701291 command_runner.go:130] > NAME="Debian GNU/Linux"
	I1124 09:19:22.437886 1701291 command_runner.go:130] > VERSION_ID="12"
	I1124 09:19:22.437890 1701291 command_runner.go:130] > VERSION="12 (bookworm)"
	I1124 09:19:22.437898 1701291 command_runner.go:130] > VERSION_CODENAME=bookworm
	I1124 09:19:22.437901 1701291 command_runner.go:130] > ID=debian
	I1124 09:19:22.437906 1701291 command_runner.go:130] > HOME_URL="https://www.debian.org/"
	I1124 09:19:22.437910 1701291 command_runner.go:130] > SUPPORT_URL="https://www.debian.org/support"
	I1124 09:19:22.437917 1701291 command_runner.go:130] > BUG_REPORT_URL="https://bugs.debian.org/"
	I1124 09:19:22.437980 1701291 main.go:143] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1124 09:19:22.437995 1701291 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1124 09:19:22.438006 1701291 filesync.go:126] Scanning /home/jenkins/minikube-integration/21978-1652607/.minikube/addons for local assets ...
	I1124 09:19:22.438064 1701291 filesync.go:126] Scanning /home/jenkins/minikube-integration/21978-1652607/.minikube/files for local assets ...
	I1124 09:19:22.438143 1701291 filesync.go:149] local asset: /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/ssl/certs/16544672.pem -> 16544672.pem in /etc/ssl/certs
	I1124 09:19:22.438150 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/ssl/certs/16544672.pem -> /etc/ssl/certs/16544672.pem
	I1124 09:19:22.438232 1701291 filesync.go:149] local asset: /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/test/nested/copy/1654467/hosts -> hosts in /etc/test/nested/copy/1654467
	I1124 09:19:22.438236 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/test/nested/copy/1654467/hosts -> /etc/test/nested/copy/1654467/hosts
	I1124 09:19:22.438277 1701291 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs /etc/test/nested/copy/1654467
	I1124 09:19:22.446265 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/ssl/certs/16544672.pem --> /etc/ssl/certs/16544672.pem (1708 bytes)
	I1124 09:19:22.463769 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/test/nested/copy/1654467/hosts --> /etc/test/nested/copy/1654467/hosts (40 bytes)
	I1124 09:19:22.481365 1701291 start.go:296] duration metric: took 174.495413ms for postStartSetup
	I1124 09:19:22.481446 1701291 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1124 09:19:22.481495 1701291 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:19:22.498552 1701291 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34684 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-291288/id_rsa Username:docker}
	I1124 09:19:22.598952 1701291 command_runner.go:130] > 14%
	I1124 09:19:22.599551 1701291 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1124 09:19:22.604050 1701291 command_runner.go:130] > 168G
	I1124 09:19:22.604631 1701291 fix.go:56] duration metric: took 1.212164413s for fixHost
	I1124 09:19:22.604655 1701291 start.go:83] releasing machines lock for "functional-291288", held for 1.212220037s
	I1124 09:19:22.604753 1701291 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-291288
	I1124 09:19:22.621885 1701291 ssh_runner.go:195] Run: cat /version.json
	I1124 09:19:22.621944 1701291 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:19:22.622207 1701291 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1124 09:19:22.622270 1701291 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:19:22.640397 1701291 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34684 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-291288/id_rsa Username:docker}
	I1124 09:19:22.648463 1701291 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34684 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-291288/id_rsa Username:docker}
	I1124 09:19:22.746016 1701291 command_runner.go:130] > {"iso_version": "v1.37.0-1763503576-21924", "kicbase_version": "v0.0.48-1763789673-21948", "minikube_version": "v1.37.0", "commit": "2996c7ec74d570fa8ab37e6f4f8813150d0c7473"}
	I1124 09:19:22.746158 1701291 ssh_runner.go:195] Run: systemctl --version
	I1124 09:19:22.840219 1701291 command_runner.go:130] > <a href="https://github.com/kubernetes/registry.k8s.io">Temporary Redirect</a>.
	I1124 09:19:22.840264 1701291 command_runner.go:130] > systemd 252 (252.39-1~deb12u1)
	I1124 09:19:22.840285 1701291 command_runner.go:130] > +PAM +AUDIT +SELINUX +APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT +QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified
	I1124 09:19:22.840354 1701291 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I1124 09:19:22.844675 1701291 command_runner.go:130] ! stat: cannot statx '/etc/cni/net.d/*loopback.conf*': No such file or directory
	W1124 09:19:22.844725 1701291 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1124 09:19:22.844793 1701291 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1124 09:19:22.852461 1701291 cni.go:259] no active bridge cni configs found in "/etc/cni/net.d" - nothing to disable
	I1124 09:19:22.852484 1701291 start.go:496] detecting cgroup driver to use...
	I1124 09:19:22.852517 1701291 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1124 09:19:22.852584 1701291 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I1124 09:19:22.868240 1701291 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I1124 09:19:22.881367 1701291 docker.go:218] disabling cri-docker service (if available) ...
	I1124 09:19:22.881470 1701291 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1124 09:19:22.896889 1701291 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1124 09:19:22.910017 1701291 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1124 09:19:23.028071 1701291 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1124 09:19:23.171419 1701291 docker.go:234] disabling docker service ...
	I1124 09:19:23.171539 1701291 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1124 09:19:23.187505 1701291 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1124 09:19:23.201405 1701291 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1124 09:19:23.324426 1701291 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1124 09:19:23.445186 1701291 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1124 09:19:23.457903 1701291 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I1124 09:19:23.470553 1701291 command_runner.go:130] > runtime-endpoint: unix:///run/containerd/containerd.sock
	I1124 09:19:23.472034 1701291 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:19:23.623898 1701291 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10.1"|' /etc/containerd/config.toml"
	I1124 09:19:23.632988 1701291 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I1124 09:19:23.641976 1701291 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I1124 09:19:23.642063 1701291 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I1124 09:19:23.651244 1701291 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1124 09:19:23.660198 1701291 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I1124 09:19:23.668706 1701291 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1124 09:19:23.677261 1701291 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1124 09:19:23.685600 1701291 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I1124 09:19:23.694593 1701291 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I1124 09:19:23.703191 1701291 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I1124 09:19:23.712006 1701291 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1124 09:19:23.718640 1701291 command_runner.go:130] > net.bridge.bridge-nf-call-iptables = 1
	I1124 09:19:23.719691 1701291 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1124 09:19:23.727172 1701291 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1124 09:19:23.844539 1701291 ssh_runner.go:195] Run: sudo systemctl restart containerd
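	The run above rewrites /etc/containerd/config.toml in place with a series of `sed` edits (pin the pause image, drop the systemd cgroup driver, normalize the runc runtime type) before reloading systemd and restarting containerd. A minimal sketch of the same idea, applied to a scratch copy of the config rather than the live file (the sed expressions and values are taken from the log; the sample TOML content is an assumption for illustration):

```shell
#!/bin/sh
# Sketch: replay the log's containerd config edits against a scratch file.
cfg=$(mktemp)
cat > "$cfg" <<'EOF'
[plugins."io.containerd.grpc.v1.cri"]
  sandbox_image = "registry.k8s.io/pause:3.9"
  restrict_oom_score_adj = true
  [plugins."io.containerd.grpc.v1.cri".containerd.runtimes.runc.options]
    SystemdCgroup = true
EOF

# Pin the pause image version expected by this Kubernetes release.
sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10.1"|' "$cfg"
# Let containerd adjust OOM scores freely.
sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' "$cfg"
# Use the cgroupfs driver instead of the systemd cgroup driver.
sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' "$cfg"

grep -E 'sandbox_image|SystemdCgroup' "$cfg"
rm -f "$cfg"
```

On the real node these edits only take effect after `systemctl daemon-reload` and `systemctl restart containerd`, which is exactly what the log does next.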
	I1124 09:19:23.964625 1701291 start.go:543] Will wait 60s for socket path /run/containerd/containerd.sock
	I1124 09:19:23.964708 1701291 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I1124 09:19:23.969624 1701291 command_runner.go:130] >   File: /run/containerd/containerd.sock
	I1124 09:19:23.969648 1701291 command_runner.go:130] >   Size: 0         	Blocks: 0          IO Block: 4096   socket
	I1124 09:19:23.969655 1701291 command_runner.go:130] > Device: 0,72	Inode: 1619        Links: 1
	I1124 09:19:23.969671 1701291 command_runner.go:130] > Access: (0660/srw-rw----)  Uid: (    0/    root)   Gid: (    0/    root)
	I1124 09:19:23.969685 1701291 command_runner.go:130] > Access: 2025-11-24 09:19:23.931843190 +0000
	I1124 09:19:23.969693 1701291 command_runner.go:130] > Modify: 2025-11-24 09:19:23.931843190 +0000
	I1124 09:19:23.969699 1701291 command_runner.go:130] > Change: 2025-11-24 09:19:23.931843190 +0000
	I1124 09:19:23.969707 1701291 command_runner.go:130] >  Birth: -
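	The "Will wait 60s for socket path" step polls `stat` on /run/containerd/containerd.sock until the restarted daemon has recreated it. A rough shell equivalent of that wait loop (the path and 60s budget come from the log; the function name and one-second poll interval are illustrative assumptions):

```shell
#!/bin/sh
# Sketch: poll for a filesystem path until it exists or a timeout expires,
# mirroring minikube's 60s wait for /run/containerd/containerd.sock.
wait_for_path() {
  path=$1
  timeout=$2
  waited=0
  while [ "$waited" -lt "$timeout" ]; do
    # stat succeeds once the socket (or any file) exists at the path.
    if stat "$path" >/dev/null 2>&1; then
      return 0
    fi
    sleep 1
    waited=$((waited + 1))
  done
  return 1
}

# Usage: wait_for_path /run/containerd/containerd.sock 60
```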
	I1124 09:19:23.970283 1701291 start.go:564] Will wait 60s for crictl version
	I1124 09:19:23.970345 1701291 ssh_runner.go:195] Run: which crictl
	I1124 09:19:23.973724 1701291 command_runner.go:130] > /usr/local/bin/crictl
	I1124 09:19:23.974288 1701291 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1124 09:19:23.995301 1701291 command_runner.go:130] > Version:  0.1.0
	I1124 09:19:23.995587 1701291 command_runner.go:130] > RuntimeName:  containerd
	I1124 09:19:23.995841 1701291 command_runner.go:130] > RuntimeVersion:  v2.1.5
	I1124 09:19:23.996049 1701291 command_runner.go:130] > RuntimeApiVersion:  v1
	I1124 09:19:23.998158 1701291 start.go:580] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v2.1.5
	RuntimeApiVersion:  v1
	I1124 09:19:23.998238 1701291 ssh_runner.go:195] Run: containerd --version
	I1124 09:19:24.020107 1701291 command_runner.go:130] > containerd containerd.io v2.1.5 fcd43222d6b07379a4be9786bda52438f0dd16a1
	I1124 09:19:24.020449 1701291 ssh_runner.go:195] Run: containerd --version
	I1124 09:19:24.041776 1701291 command_runner.go:130] > containerd containerd.io v2.1.5 fcd43222d6b07379a4be9786bda52438f0dd16a1
	I1124 09:19:24.047417 1701291 out.go:179] * Preparing Kubernetes v1.35.0-beta.0 on containerd 2.1.5 ...
	I1124 09:19:24.050497 1701291 cli_runner.go:164] Run: docker network inspect functional-291288 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1124 09:19:24.067531 1701291 ssh_runner.go:195] Run: grep 192.168.49.1	host.minikube.internal$ /etc/hosts
	I1124 09:19:24.071507 1701291 command_runner.go:130] > 192.168.49.1	host.minikube.internal
	I1124 09:19:24.071622 1701291 kubeadm.go:884] updating cluster {Name:functional-291288 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-291288 Namespace:default APIServerHAVIP: APIServerName:minikub
eCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:fal
se CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1124 09:19:24.071797 1701291 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:19:24.253230 1701291 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:19:24.402285 1701291 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:19:24.552419 1701291 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
	I1124 09:19:24.552515 1701291 ssh_runner.go:195] Run: sudo crictl images --output json
	I1124 09:19:24.577200 1701291 command_runner.go:130] > {
	I1124 09:19:24.577221 1701291 command_runner.go:130] >   "images":  [
	I1124 09:19:24.577226 1701291 command_runner.go:130] >     {
	I1124 09:19:24.577235 1701291 command_runner.go:130] >       "id":  "sha256:66749159455b3f08c8318fe0233122f54d0f5889f9c5fdfb73c3fd9d99895b51",
	I1124 09:19:24.577240 1701291 command_runner.go:130] >       "repoTags":  [
	I1124 09:19:24.577245 1701291 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner:v5"
	I1124 09:19:24.577248 1701291 command_runner.go:130] >       ],
	I1124 09:19:24.577252 1701291 command_runner.go:130] >       "repoDigests":  [],
	I1124 09:19:24.577256 1701291 command_runner.go:130] >       "size":  "8032639",
	I1124 09:19:24.577264 1701291 command_runner.go:130] >       "username":  "",
	I1124 09:19:24.577269 1701291 command_runner.go:130] >       "pinned":  false
	I1124 09:19:24.577272 1701291 command_runner.go:130] >     },
	I1124 09:19:24.577276 1701291 command_runner.go:130] >     {
	I1124 09:19:24.577283 1701291 command_runner.go:130] >       "id":  "sha256:e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf",
	I1124 09:19:24.577290 1701291 command_runner.go:130] >       "repoTags":  [
	I1124 09:19:24.577296 1701291 command_runner.go:130] >         "registry.k8s.io/coredns/coredns:v1.13.1"
	I1124 09:19:24.577299 1701291 command_runner.go:130] >       ],
	I1124 09:19:24.577308 1701291 command_runner.go:130] >       "repoDigests":  [],
	I1124 09:19:24.577330 1701291 command_runner.go:130] >       "size":  "21166088",
	I1124 09:19:24.577335 1701291 command_runner.go:130] >       "username":  "nonroot",
	I1124 09:19:24.577339 1701291 command_runner.go:130] >       "pinned":  false
	I1124 09:19:24.577349 1701291 command_runner.go:130] >     },
	I1124 09:19:24.577357 1701291 command_runner.go:130] >     {
	I1124 09:19:24.577364 1701291 command_runner.go:130] >       "id":  "sha256:1211402d28f5813ed906916bfcdd0a7404c2f9048ef5bb54387a6745bc410eca",
	I1124 09:19:24.577368 1701291 command_runner.go:130] >       "repoTags":  [
	I1124 09:19:24.577373 1701291 command_runner.go:130] >         "registry.k8s.io/etcd:3.5.24-0"
	I1124 09:19:24.577376 1701291 command_runner.go:130] >       ],
	I1124 09:19:24.577380 1701291 command_runner.go:130] >       "repoDigests":  [],
	I1124 09:19:24.577384 1701291 command_runner.go:130] >       "size":  "21880804",
	I1124 09:19:24.577391 1701291 command_runner.go:130] >       "uid":  {
	I1124 09:19:24.577395 1701291 command_runner.go:130] >         "value":  "0"
	I1124 09:19:24.577400 1701291 command_runner.go:130] >       },
	I1124 09:19:24.577404 1701291 command_runner.go:130] >       "username":  "",
	I1124 09:19:24.577408 1701291 command_runner.go:130] >       "pinned":  false
	I1124 09:19:24.577421 1701291 command_runner.go:130] >     },
	I1124 09:19:24.577424 1701291 command_runner.go:130] >     {
	I1124 09:19:24.577431 1701291 command_runner.go:130] >       "id":  "sha256:2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42",
	I1124 09:19:24.577434 1701291 command_runner.go:130] >       "repoTags":  [
	I1124 09:19:24.577443 1701291 command_runner.go:130] >         "registry.k8s.io/etcd:3.6.5-0"
	I1124 09:19:24.577450 1701291 command_runner.go:130] >       ],
	I1124 09:19:24.577454 1701291 command_runner.go:130] >       "repoDigests":  [
	I1124 09:19:24.577461 1701291 command_runner.go:130] >         "registry.k8s.io/etcd@sha256:042ef9c02799eb9303abf1aa99b09f09d94b8ee3ba0c2dd3f42dc4e1d3dce534"
	I1124 09:19:24.577465 1701291 command_runner.go:130] >       ],
	I1124 09:19:24.577469 1701291 command_runner.go:130] >       "size":  "21136588",
	I1124 09:19:24.577472 1701291 command_runner.go:130] >       "uid":  {
	I1124 09:19:24.577479 1701291 command_runner.go:130] >         "value":  "0"
	I1124 09:19:24.577482 1701291 command_runner.go:130] >       },
	I1124 09:19:24.577486 1701291 command_runner.go:130] >       "username":  "",
	I1124 09:19:24.577492 1701291 command_runner.go:130] >       "pinned":  false
	I1124 09:19:24.577495 1701291 command_runner.go:130] >     },
	I1124 09:19:24.577502 1701291 command_runner.go:130] >     {
	I1124 09:19:24.577512 1701291 command_runner.go:130] >       "id":  "sha256:ccd634d9bcc36ac6235e9c86676cd3a02c06afc3788a25f1bbf39ca7d44585f4",
	I1124 09:19:24.577516 1701291 command_runner.go:130] >       "repoTags":  [
	I1124 09:19:24.577521 1701291 command_runner.go:130] >         "registry.k8s.io/kube-apiserver:v1.35.0-beta.0"
	I1124 09:19:24.577527 1701291 command_runner.go:130] >       ],
	I1124 09:19:24.577531 1701291 command_runner.go:130] >       "repoDigests":  [],
	I1124 09:19:24.577535 1701291 command_runner.go:130] >       "size":  "24676285",
	I1124 09:19:24.577538 1701291 command_runner.go:130] >       "uid":  {
	I1124 09:19:24.577541 1701291 command_runner.go:130] >         "value":  "0"
	I1124 09:19:24.577545 1701291 command_runner.go:130] >       },
	I1124 09:19:24.577550 1701291 command_runner.go:130] >       "username":  "",
	I1124 09:19:24.577556 1701291 command_runner.go:130] >       "pinned":  false
	I1124 09:19:24.577560 1701291 command_runner.go:130] >     },
	I1124 09:19:24.577563 1701291 command_runner.go:130] >     {
	I1124 09:19:24.577569 1701291 command_runner.go:130] >       "id":  "sha256:68b5f775f18769fcb77bd8474c80bda2050163b6c66f4551f352b7381b8ca5be",
	I1124 09:19:24.577581 1701291 command_runner.go:130] >       "repoTags":  [
	I1124 09:19:24.577586 1701291 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager:v1.35.0-beta.0"
	I1124 09:19:24.577590 1701291 command_runner.go:130] >       ],
	I1124 09:19:24.577594 1701291 command_runner.go:130] >       "repoDigests":  [],
	I1124 09:19:24.577605 1701291 command_runner.go:130] >       "size":  "20658969",
	I1124 09:19:24.577608 1701291 command_runner.go:130] >       "uid":  {
	I1124 09:19:24.577612 1701291 command_runner.go:130] >         "value":  "0"
	I1124 09:19:24.577615 1701291 command_runner.go:130] >       },
	I1124 09:19:24.577619 1701291 command_runner.go:130] >       "username":  "",
	I1124 09:19:24.577624 1701291 command_runner.go:130] >       "pinned":  false
	I1124 09:19:24.577629 1701291 command_runner.go:130] >     },
	I1124 09:19:24.577633 1701291 command_runner.go:130] >     {
	I1124 09:19:24.577644 1701291 command_runner.go:130] >       "id":  "sha256:404c2e12861777b763b8feaa316d36680fc68ad308a8d2f6e55f1bb981cdd904",
	I1124 09:19:24.577655 1701291 command_runner.go:130] >       "repoTags":  [
	I1124 09:19:24.577660 1701291 command_runner.go:130] >         "registry.k8s.io/kube-proxy:v1.35.0-beta.0"
	I1124 09:19:24.577663 1701291 command_runner.go:130] >       ],
	I1124 09:19:24.577667 1701291 command_runner.go:130] >       "repoDigests":  [],
	I1124 09:19:24.577678 1701291 command_runner.go:130] >       "size":  "22428165",
	I1124 09:19:24.577686 1701291 command_runner.go:130] >       "username":  "",
	I1124 09:19:24.577692 1701291 command_runner.go:130] >       "pinned":  false
	I1124 09:19:24.577696 1701291 command_runner.go:130] >     },
	I1124 09:19:24.577706 1701291 command_runner.go:130] >     {
	I1124 09:19:24.577712 1701291 command_runner.go:130] >       "id":  "sha256:16378741539f1be9c6e347d127537d379a6592587b09b4eb47964cb5c43a409b",
	I1124 09:19:24.577716 1701291 command_runner.go:130] >       "repoTags":  [
	I1124 09:19:24.577721 1701291 command_runner.go:130] >         "registry.k8s.io/kube-scheduler:v1.35.0-beta.0"
	I1124 09:19:24.577724 1701291 command_runner.go:130] >       ],
	I1124 09:19:24.577728 1701291 command_runner.go:130] >       "repoDigests":  [],
	I1124 09:19:24.577738 1701291 command_runner.go:130] >       "size":  "15389290",
	I1124 09:19:24.577744 1701291 command_runner.go:130] >       "uid":  {
	I1124 09:19:24.577751 1701291 command_runner.go:130] >         "value":  "0"
	I1124 09:19:24.577754 1701291 command_runner.go:130] >       },
	I1124 09:19:24.577758 1701291 command_runner.go:130] >       "username":  "",
	I1124 09:19:24.577762 1701291 command_runner.go:130] >       "pinned":  false
	I1124 09:19:24.577768 1701291 command_runner.go:130] >     },
	I1124 09:19:24.577771 1701291 command_runner.go:130] >     {
	I1124 09:19:24.577779 1701291 command_runner.go:130] >       "id":  "sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd",
	I1124 09:19:24.577786 1701291 command_runner.go:130] >       "repoTags":  [
	I1124 09:19:24.577791 1701291 command_runner.go:130] >         "registry.k8s.io/pause:3.10.1"
	I1124 09:19:24.577794 1701291 command_runner.go:130] >       ],
	I1124 09:19:24.577797 1701291 command_runner.go:130] >       "repoDigests":  [],
	I1124 09:19:24.577801 1701291 command_runner.go:130] >       "size":  "265458",
	I1124 09:19:24.577805 1701291 command_runner.go:130] >       "uid":  {
	I1124 09:19:24.577809 1701291 command_runner.go:130] >         "value":  "65535"
	I1124 09:19:24.577815 1701291 command_runner.go:130] >       },
	I1124 09:19:24.577819 1701291 command_runner.go:130] >       "username":  "",
	I1124 09:19:24.577824 1701291 command_runner.go:130] >       "pinned":  true
	I1124 09:19:24.577827 1701291 command_runner.go:130] >     }
	I1124 09:19:24.577831 1701291 command_runner.go:130] >   ]
	I1124 09:19:24.577842 1701291 command_runner.go:130] > }
	I1124 09:19:24.577988 1701291 containerd.go:627] all images are preloaded for containerd runtime.
	I1124 09:19:24.578000 1701291 cache_images.go:86] Images are preloaded, skipping loading
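	The preload check above parses `sudo crictl images --output json` and concludes every required image is already present. A small sketch of that kind of presence check against an excerpt of the JSON shown in the log (a real implementation would parse the JSON properly, e.g. with jq; the plain `grep` here is a simplifying assumption):

```shell
#!/bin/sh
# Sketch: check whether a repo tag appears in crictl's JSON image listing.
# The JSON excerpt below is abridged from the log output.
json='{"images":[{"repoTags":["registry.k8s.io/pause:3.10.1"]},{"repoTags":["registry.k8s.io/etcd:3.6.5-0"]}]}'

has_image() {
  # Match the quoted tag string inside the JSON; crude but sufficient here.
  printf '%s' "$json" | grep -q "\"$1\""
}

has_image "registry.k8s.io/pause:3.10.1" && echo present
```

When every tag required for the target Kubernetes version is found this way, the loader can skip transferring images, which is what "Images are preloaded, skipping loading" records.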
	I1124 09:19:24.578012 1701291 kubeadm.go:935] updating node { 192.168.49.2 8441 v1.35.0-beta.0 containerd true true} ...
	I1124 09:19:24.578111 1701291 kubeadm.go:947] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.35.0-beta.0/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=functional-291288 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.49.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-291288 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
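	The kubelet unit text above is rendered into a systemd drop-in (the log later scp's it to /etc/systemd/system/kubelet.service.d/10-kubeadm.conf). A sketch that materializes the same drop-in into a scratch directory, with the ExecStart line copied verbatim from the log (writing to a temp dir instead of /etc is the only change):

```shell
#!/bin/sh
# Sketch: write the kubelet systemd drop-in shown in the log to a scratch dir.
dir=$(mktemp -d)
cat > "$dir/10-kubeadm.conf" <<'EOF'
[Unit]
Wants=containerd.service

[Service]
ExecStart=
ExecStart=/var/lib/minikube/binaries/v1.35.0-beta.0/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=functional-291288 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.49.2

[Install]
EOF

# Two ExecStart lines: the empty one clears the packaged unit's command,
# the second sets minikube's own kubelet invocation.
grep -c '^ExecStart=' "$dir/10-kubeadm.conf"
rm -rf "$dir"
```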
	I1124 09:19:24.578176 1701291 ssh_runner.go:195] Run: sudo crictl info
	I1124 09:19:24.601872 1701291 command_runner.go:130] > {
	I1124 09:19:24.601895 1701291 command_runner.go:130] >   "cniconfig": {
	I1124 09:19:24.601901 1701291 command_runner.go:130] >     "Networks": [
	I1124 09:19:24.601905 1701291 command_runner.go:130] >       {
	I1124 09:19:24.601909 1701291 command_runner.go:130] >         "Config": {
	I1124 09:19:24.601914 1701291 command_runner.go:130] >           "CNIVersion": "0.3.1",
	I1124 09:19:24.601919 1701291 command_runner.go:130] >           "Name": "cni-loopback",
	I1124 09:19:24.601924 1701291 command_runner.go:130] >           "Plugins": [
	I1124 09:19:24.601927 1701291 command_runner.go:130] >             {
	I1124 09:19:24.601931 1701291 command_runner.go:130] >               "Network": {
	I1124 09:19:24.601935 1701291 command_runner.go:130] >                 "ipam": {},
	I1124 09:19:24.601941 1701291 command_runner.go:130] >                 "type": "loopback"
	I1124 09:19:24.601945 1701291 command_runner.go:130] >               },
	I1124 09:19:24.601958 1701291 command_runner.go:130] >               "Source": "{\"type\":\"loopback\"}"
	I1124 09:19:24.601965 1701291 command_runner.go:130] >             }
	I1124 09:19:24.601969 1701291 command_runner.go:130] >           ],
	I1124 09:19:24.601979 1701291 command_runner.go:130] >           "Source": "{\n\"cniVersion\": \"0.3.1\",\n\"name\": \"cni-loopback\",\n\"plugins\": [{\n  \"type\": \"loopback\"\n}]\n}"
	I1124 09:19:24.601983 1701291 command_runner.go:130] >         },
	I1124 09:19:24.601991 1701291 command_runner.go:130] >         "IFName": "lo"
	I1124 09:19:24.601994 1701291 command_runner.go:130] >       }
	I1124 09:19:24.601997 1701291 command_runner.go:130] >     ],
	I1124 09:19:24.602003 1701291 command_runner.go:130] >     "PluginConfDir": "/etc/cni/net.d",
	I1124 09:19:24.602007 1701291 command_runner.go:130] >     "PluginDirs": [
	I1124 09:19:24.602014 1701291 command_runner.go:130] >       "/opt/cni/bin"
	I1124 09:19:24.602018 1701291 command_runner.go:130] >     ],
	I1124 09:19:24.602026 1701291 command_runner.go:130] >     "PluginMaxConfNum": 1,
	I1124 09:19:24.602030 1701291 command_runner.go:130] >     "Prefix": "eth"
	I1124 09:19:24.602033 1701291 command_runner.go:130] >   },
	I1124 09:19:24.602037 1701291 command_runner.go:130] >   "config": {
	I1124 09:19:24.602041 1701291 command_runner.go:130] >     "cdiSpecDirs": [
	I1124 09:19:24.602048 1701291 command_runner.go:130] >       "/etc/cdi",
	I1124 09:19:24.602051 1701291 command_runner.go:130] >       "/var/run/cdi"
	I1124 09:19:24.602055 1701291 command_runner.go:130] >     ],
	I1124 09:19:24.602069 1701291 command_runner.go:130] >     "cni": {
	I1124 09:19:24.602073 1701291 command_runner.go:130] >       "binDir": "",
	I1124 09:19:24.602076 1701291 command_runner.go:130] >       "binDirs": [
	I1124 09:19:24.602080 1701291 command_runner.go:130] >         "/opt/cni/bin"
	I1124 09:19:24.602083 1701291 command_runner.go:130] >       ],
	I1124 09:19:24.602087 1701291 command_runner.go:130] >       "confDir": "/etc/cni/net.d",
	I1124 09:19:24.602092 1701291 command_runner.go:130] >       "confTemplate": "",
	I1124 09:19:24.602098 1701291 command_runner.go:130] >       "ipPref": "",
	I1124 09:19:24.602103 1701291 command_runner.go:130] >       "maxConfNum": 1,
	I1124 09:19:24.602109 1701291 command_runner.go:130] >       "setupSerially": false,
	I1124 09:19:24.602114 1701291 command_runner.go:130] >       "useInternalLoopback": false
	I1124 09:19:24.602120 1701291 command_runner.go:130] >     },
	I1124 09:19:24.602126 1701291 command_runner.go:130] >     "containerd": {
	I1124 09:19:24.602132 1701291 command_runner.go:130] >       "defaultRuntimeName": "runc",
	I1124 09:19:24.602137 1701291 command_runner.go:130] >       "ignoreBlockIONotEnabledErrors": false,
	I1124 09:19:24.602145 1701291 command_runner.go:130] >       "ignoreRdtNotEnabledErrors": false,
	I1124 09:19:24.602149 1701291 command_runner.go:130] >       "runtimes": {
	I1124 09:19:24.602152 1701291 command_runner.go:130] >         "runc": {
	I1124 09:19:24.602157 1701291 command_runner.go:130] >           "ContainerAnnotations": null,
	I1124 09:19:24.602163 1701291 command_runner.go:130] >           "PodAnnotations": null,
	I1124 09:19:24.602169 1701291 command_runner.go:130] >           "baseRuntimeSpec": "",
	I1124 09:19:24.602174 1701291 command_runner.go:130] >           "cgroupWritable": false,
	I1124 09:19:24.602179 1701291 command_runner.go:130] >           "cniConfDir": "",
	I1124 09:19:24.602185 1701291 command_runner.go:130] >           "cniMaxConfNum": 0,
	I1124 09:19:24.602190 1701291 command_runner.go:130] >           "io_type": "",
	I1124 09:19:24.602195 1701291 command_runner.go:130] >           "options": {
	I1124 09:19:24.602200 1701291 command_runner.go:130] >             "BinaryName": "",
	I1124 09:19:24.602212 1701291 command_runner.go:130] >             "CriuImagePath": "",
	I1124 09:19:24.602217 1701291 command_runner.go:130] >             "CriuWorkPath": "",
	I1124 09:19:24.602221 1701291 command_runner.go:130] >             "IoGid": 0,
	I1124 09:19:24.602226 1701291 command_runner.go:130] >             "IoUid": 0,
	I1124 09:19:24.602232 1701291 command_runner.go:130] >             "NoNewKeyring": false,
	I1124 09:19:24.602237 1701291 command_runner.go:130] >             "Root": "",
	I1124 09:19:24.602243 1701291 command_runner.go:130] >             "ShimCgroup": "",
	I1124 09:19:24.602248 1701291 command_runner.go:130] >             "SystemdCgroup": false
	I1124 09:19:24.602252 1701291 command_runner.go:130] >           },
	I1124 09:19:24.602257 1701291 command_runner.go:130] >           "privileged_without_host_devices": false,
	I1124 09:19:24.602266 1701291 command_runner.go:130] >           "privileged_without_host_devices_all_devices_allowed": false,
	I1124 09:19:24.602272 1701291 command_runner.go:130] >           "runtimePath": "",
	I1124 09:19:24.602278 1701291 command_runner.go:130] >           "runtimeType": "io.containerd.runc.v2",
	I1124 09:19:24.602285 1701291 command_runner.go:130] >           "sandboxer": "podsandbox",
	I1124 09:19:24.602290 1701291 command_runner.go:130] >           "snapshotter": ""
	I1124 09:19:24.602293 1701291 command_runner.go:130] >         }
	I1124 09:19:24.602296 1701291 command_runner.go:130] >       }
	I1124 09:19:24.602299 1701291 command_runner.go:130] >     },
	I1124 09:19:24.602309 1701291 command_runner.go:130] >     "containerdEndpoint": "/run/containerd/containerd.sock",
	I1124 09:19:24.602332 1701291 command_runner.go:130] >     "containerdRootDir": "/var/lib/containerd",
	I1124 09:19:24.602339 1701291 command_runner.go:130] >     "device_ownership_from_security_context": false,
	I1124 09:19:24.602344 1701291 command_runner.go:130] >     "disableApparmor": false,
	I1124 09:19:24.602351 1701291 command_runner.go:130] >     "disableHugetlbController": true,
	I1124 09:19:24.602355 1701291 command_runner.go:130] >     "disableProcMount": false,
	I1124 09:19:24.602362 1701291 command_runner.go:130] >     "drainExecSyncIOTimeout": "0s",
	I1124 09:19:24.602366 1701291 command_runner.go:130] >     "enableCDI": true,
	I1124 09:19:24.602378 1701291 command_runner.go:130] >     "enableSelinux": false,
	I1124 09:19:24.602382 1701291 command_runner.go:130] >     "enableUnprivilegedICMP": true,
	I1124 09:19:24.602387 1701291 command_runner.go:130] >     "enableUnprivilegedPorts": true,
	I1124 09:19:24.602392 1701291 command_runner.go:130] >     "ignoreDeprecationWarnings": null,
	I1124 09:19:24.602403 1701291 command_runner.go:130] >     "ignoreImageDefinedVolumes": false,
	I1124 09:19:24.602408 1701291 command_runner.go:130] >     "maxContainerLogLineSize": 16384,
	I1124 09:19:24.602413 1701291 command_runner.go:130] >     "netnsMountsUnderStateDir": false,
	I1124 09:19:24.602417 1701291 command_runner.go:130] >     "restrictOOMScoreAdj": false,
	I1124 09:19:24.602422 1701291 command_runner.go:130] >     "rootDir": "/var/lib/containerd/io.containerd.grpc.v1.cri",
	I1124 09:19:24.602427 1701291 command_runner.go:130] >     "selinuxCategoryRange": 1024,
	I1124 09:19:24.602432 1701291 command_runner.go:130] >     "stateDir": "/run/containerd/io.containerd.grpc.v1.cri",
	I1124 09:19:24.602438 1701291 command_runner.go:130] >     "tolerateMissingHugetlbController": true,
	I1124 09:19:24.602441 1701291 command_runner.go:130] >     "unsetSeccompProfile": ""
	I1124 09:19:24.602445 1701291 command_runner.go:130] >   },
	I1124 09:19:24.602449 1701291 command_runner.go:130] >   "features": {
	I1124 09:19:24.602492 1701291 command_runner.go:130] >     "supplemental_groups_policy": true
	I1124 09:19:24.602500 1701291 command_runner.go:130] >   },
	I1124 09:19:24.602504 1701291 command_runner.go:130] >   "golang": "go1.24.9",
	I1124 09:19:24.602513 1701291 command_runner.go:130] >   "lastCNILoadStatus": "cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config",
	I1124 09:19:24.602527 1701291 command_runner.go:130] >   "lastCNILoadStatus.default": "cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config",
	I1124 09:19:24.602532 1701291 command_runner.go:130] >   "runtimeHandlers": [
	I1124 09:19:24.602537 1701291 command_runner.go:130] >     {
	I1124 09:19:24.602541 1701291 command_runner.go:130] >       "features": {
	I1124 09:19:24.602546 1701291 command_runner.go:130] >         "recursive_read_only_mounts": true,
	I1124 09:19:24.602550 1701291 command_runner.go:130] >         "user_namespaces": true
	I1124 09:19:24.602555 1701291 command_runner.go:130] >       }
	I1124 09:19:24.602564 1701291 command_runner.go:130] >     },
	I1124 09:19:24.602570 1701291 command_runner.go:130] >     {
	I1124 09:19:24.602575 1701291 command_runner.go:130] >       "features": {
	I1124 09:19:24.602587 1701291 command_runner.go:130] >         "recursive_read_only_mounts": true,
	I1124 09:19:24.602592 1701291 command_runner.go:130] >         "user_namespaces": true
	I1124 09:19:24.602595 1701291 command_runner.go:130] >       },
	I1124 09:19:24.602598 1701291 command_runner.go:130] >       "name": "runc"
	I1124 09:19:24.602609 1701291 command_runner.go:130] >     }
	I1124 09:19:24.602612 1701291 command_runner.go:130] >   ],
	I1124 09:19:24.602615 1701291 command_runner.go:130] >   "status": {
	I1124 09:19:24.602619 1701291 command_runner.go:130] >     "conditions": [
	I1124 09:19:24.602623 1701291 command_runner.go:130] >       {
	I1124 09:19:24.602629 1701291 command_runner.go:130] >         "message": "",
	I1124 09:19:24.602633 1701291 command_runner.go:130] >         "reason": "",
	I1124 09:19:24.602637 1701291 command_runner.go:130] >         "status": true,
	I1124 09:19:24.602641 1701291 command_runner.go:130] >         "type": "RuntimeReady"
	I1124 09:19:24.602645 1701291 command_runner.go:130] >       },
	I1124 09:19:24.602648 1701291 command_runner.go:130] >       {
	I1124 09:19:24.602655 1701291 command_runner.go:130] >         "message": "Network plugin returns error: cni plugin not initialized",
	I1124 09:19:24.602662 1701291 command_runner.go:130] >         "reason": "NetworkPluginNotReady",
	I1124 09:19:24.602666 1701291 command_runner.go:130] >         "status": false,
	I1124 09:19:24.602678 1701291 command_runner.go:130] >         "type": "NetworkReady"
	I1124 09:19:24.602682 1701291 command_runner.go:130] >       },
	I1124 09:19:24.602685 1701291 command_runner.go:130] >       {
	I1124 09:19:24.602688 1701291 command_runner.go:130] >         "message": "",
	I1124 09:19:24.602692 1701291 command_runner.go:130] >         "reason": "",
	I1124 09:19:24.602703 1701291 command_runner.go:130] >         "status": true,
	I1124 09:19:24.602709 1701291 command_runner.go:130] >         "type": "ContainerdHasNoDeprecationWarnings"
	I1124 09:19:24.602712 1701291 command_runner.go:130] >       }
	I1124 09:19:24.602715 1701291 command_runner.go:130] >     ]
	I1124 09:19:24.602718 1701291 command_runner.go:130] >   }
	I1124 09:19:24.602721 1701291 command_runner.go:130] > }
	I1124 09:19:24.603033 1701291 cni.go:84] Creating CNI manager for ""
	I1124 09:19:24.603051 1701291 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1124 09:19:24.603074 1701291 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1124 09:19:24.603102 1701291 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.49.2 APIServerPort:8441 KubernetesVersion:v1.35.0-beta.0 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:functional-291288 NodeName:functional-291288 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.49.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.49.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt Sta
ticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1124 09:19:24.603228 1701291 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.49.2
	  bindPort: 8441
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "functional-291288"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.49.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8441
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.35.0-beta.0
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I1124 09:19:24.603309 1701291 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.35.0-beta.0
	I1124 09:19:24.611119 1701291 command_runner.go:130] > kubeadm
	I1124 09:19:24.611140 1701291 command_runner.go:130] > kubectl
	I1124 09:19:24.611146 1701291 command_runner.go:130] > kubelet
	I1124 09:19:24.611161 1701291 binaries.go:51] Found k8s binaries, skipping transfer
	I1124 09:19:24.611223 1701291 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1124 09:19:24.618883 1701291 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (328 bytes)
	I1124 09:19:24.633448 1701291 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (359 bytes)
	I1124 09:19:24.650072 1701291 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2237 bytes)
	I1124 09:19:24.664688 1701291 ssh_runner.go:195] Run: grep 192.168.49.2	control-plane.minikube.internal$ /etc/hosts
	I1124 09:19:24.668362 1701291 command_runner.go:130] > 192.168.49.2	control-plane.minikube.internal
	I1124 09:19:24.668996 1701291 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1124 09:19:24.787731 1701291 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1124 09:19:25.630718 1701291 certs.go:69] Setting up /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288 for IP: 192.168.49.2
	I1124 09:19:25.630736 1701291 certs.go:195] generating shared ca certs ...
	I1124 09:19:25.630751 1701291 certs.go:227] acquiring lock for ca certs: {Name:mkbe540a30c4376a351176f7fe6fec044d058b09 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1124 09:19:25.630878 1701291 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.key
	I1124 09:19:25.630932 1701291 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/proxy-client-ca.key
	I1124 09:19:25.630939 1701291 certs.go:257] generating profile certs ...
	I1124 09:19:25.631060 1701291 certs.go:360] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/client.key
	I1124 09:19:25.631119 1701291 certs.go:360] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/apiserver.key.5acb2515
	I1124 09:19:25.631156 1701291 certs.go:360] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/proxy-client.key
	I1124 09:19:25.631166 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I1124 09:19:25.631180 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I1124 09:19:25.631190 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I1124 09:19:25.631200 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I1124 09:19:25.631210 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I1124 09:19:25.631221 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I1124 09:19:25.631231 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I1124 09:19:25.631241 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I1124 09:19:25.631304 1701291 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/1654467.pem (1338 bytes)
	W1124 09:19:25.631338 1701291 certs.go:480] ignoring /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/1654467_empty.pem, impossibly tiny 0 bytes
	I1124 09:19:25.631352 1701291 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca-key.pem (1671 bytes)
	I1124 09:19:25.631382 1701291 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem (1078 bytes)
	I1124 09:19:25.631410 1701291 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/cert.pem (1123 bytes)
	I1124 09:19:25.631434 1701291 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/key.pem (1679 bytes)
	I1124 09:19:25.631484 1701291 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/ssl/certs/16544672.pem (1708 bytes)
	I1124 09:19:25.631512 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I1124 09:19:25.631529 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/1654467.pem -> /usr/share/ca-certificates/1654467.pem
	I1124 09:19:25.631542 1701291 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/ssl/certs/16544672.pem -> /usr/share/ca-certificates/16544672.pem
	I1124 09:19:25.632117 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1124 09:19:25.653566 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I1124 09:19:25.672677 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1124 09:19:25.692448 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I1124 09:19:25.712758 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I1124 09:19:25.730246 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I1124 09:19:25.748136 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1124 09:19:25.765102 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I1124 09:19:25.782676 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1124 09:19:25.800418 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/1654467.pem --> /usr/share/ca-certificates/1654467.pem (1338 bytes)
	I1124 09:19:25.818179 1701291 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/ssl/certs/16544672.pem --> /usr/share/ca-certificates/16544672.pem (1708 bytes)
	I1124 09:19:25.836420 1701291 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I1124 09:19:25.849273 1701291 ssh_runner.go:195] Run: openssl version
	I1124 09:19:25.855675 1701291 command_runner.go:130] > OpenSSL 3.0.17 1 Jul 2025 (Library: OpenSSL 3.0.17 1 Jul 2025)
	I1124 09:19:25.855803 1701291 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I1124 09:19:25.864243 1701291 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1124 09:19:25.867919 1701291 command_runner.go:130] > -rw-r--r-- 1 root root 1111 Nov 24 08:44 /usr/share/ca-certificates/minikubeCA.pem
	I1124 09:19:25.867982 1701291 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Nov 24 08:44 /usr/share/ca-certificates/minikubeCA.pem
	I1124 09:19:25.868042 1701291 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1124 09:19:25.908611 1701291 command_runner.go:130] > b5213941
	I1124 09:19:25.909123 1701291 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I1124 09:19:25.916880 1701291 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1654467.pem && ln -fs /usr/share/ca-certificates/1654467.pem /etc/ssl/certs/1654467.pem"
	I1124 09:19:25.925097 1701291 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1654467.pem
	I1124 09:19:25.928711 1701291 command_runner.go:130] > -rw-r--r-- 1 root root 1338 Nov 24 09:10 /usr/share/ca-certificates/1654467.pem
	I1124 09:19:25.928823 1701291 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Nov 24 09:10 /usr/share/ca-certificates/1654467.pem
	I1124 09:19:25.928900 1701291 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1654467.pem
	I1124 09:19:25.969833 1701291 command_runner.go:130] > 51391683
	I1124 09:19:25.970298 1701291 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1654467.pem /etc/ssl/certs/51391683.0"
	I1124 09:19:25.978202 1701291 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/16544672.pem && ln -fs /usr/share/ca-certificates/16544672.pem /etc/ssl/certs/16544672.pem"
	I1124 09:19:25.986297 1701291 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/16544672.pem
	I1124 09:19:25.989958 1701291 command_runner.go:130] > -rw-r--r-- 1 root root 1708 Nov 24 09:10 /usr/share/ca-certificates/16544672.pem
	I1124 09:19:25.990028 1701291 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Nov 24 09:10 /usr/share/ca-certificates/16544672.pem
	I1124 09:19:25.990094 1701291 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/16544672.pem
	I1124 09:19:26.030947 1701291 command_runner.go:130] > 3ec20f2e
	I1124 09:19:26.031428 1701291 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/16544672.pem /etc/ssl/certs/3ec20f2e.0"
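	The three `openssl x509 -hash` / `ln -fs` pairs above install each CA into the OpenSSL trust directory, which resolves trusted certificates through symlinks named `<subject-hash>.0`. A minimal sketch of that step, using a throwaway self-signed CA in a temp directory in place of minikube's real paths:

```shell
# Reproduce the hash-and-symlink step from the log with a throwaway CA.
# OpenSSL locates trusted CAs via symlinks named <subject-hash>.0, so the
# hash is recomputed and the link recreated after each cert is copied in.
set -e
dir=$(mktemp -d)
openssl req -x509 -newkey rsa:2048 -nodes -days 1 -subj "/CN=demoCA" \
  -keyout "$dir/ca.key" -out "$dir/ca.pem" 2>/dev/null
hash=$(openssl x509 -hash -noout -in "$dir/ca.pem")
ln -fs "$dir/ca.pem" "$dir/$hash.0"
# The cert is now reachable under its hash name:
openssl x509 -noout -subject -in "$dir/$hash.0"
```

	In the log the same pattern appears three times, once per CA file (minikubeCA.pem, 1654467.pem, 16544672.pem), each guarded by `test -L` so an existing correct symlink is left alone.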
	I1124 09:19:26.039972 1701291 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1124 09:19:26.043966 1701291 command_runner.go:130] >   File: /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1124 09:19:26.043995 1701291 command_runner.go:130] >   Size: 1176      	Blocks: 8          IO Block: 4096   regular file
	I1124 09:19:26.044001 1701291 command_runner.go:130] > Device: 259,1	Inode: 1320367     Links: 1
	I1124 09:19:26.044008 1701291 command_runner.go:130] > Access: (0644/-rw-r--r--)  Uid: (    0/    root)   Gid: (    0/    root)
	I1124 09:19:26.044023 1701291 command_runner.go:130] > Access: 2025-11-24 09:15:17.409446871 +0000
	I1124 09:19:26.044028 1701291 command_runner.go:130] > Modify: 2025-11-24 09:11:12.722825550 +0000
	I1124 09:19:26.044034 1701291 command_runner.go:130] > Change: 2025-11-24 09:11:12.722825550 +0000
	I1124 09:19:26.044039 1701291 command_runner.go:130] >  Birth: 2025-11-24 09:11:12.722825550 +0000
	I1124 09:19:26.044132 1701291 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I1124 09:19:26.086676 1701291 command_runner.go:130] > Certificate will not expire
	I1124 09:19:26.086876 1701291 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I1124 09:19:26.129915 1701291 command_runner.go:130] > Certificate will not expire
	I1124 09:19:26.130020 1701291 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I1124 09:19:26.173544 1701291 command_runner.go:130] > Certificate will not expire
	I1124 09:19:26.174084 1701291 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I1124 09:19:26.214370 1701291 command_runner.go:130] > Certificate will not expire
	I1124 09:19:26.214874 1701291 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I1124 09:19:26.257535 1701291 command_runner.go:130] > Certificate will not expire
	I1124 09:19:26.257999 1701291 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I1124 09:19:26.298467 1701291 command_runner.go:130] > Certificate will not expire
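	Each "Certificate will not expire" line above is the stdout of `openssl x509 -checkend 86400`, which exits 0 only if the certificate is still valid 86400 seconds (24 hours) from now. A self-contained sketch with a temporary certificate standing in for the real files under /var/lib/minikube/certs:

```shell
# -checkend N exits 0 and prints "Certificate will not expire" if the cert
# remains valid N seconds from now; a 24h window decides whether the cert
# needs regeneration before the cluster is (re)started.
set -e
dir=$(mktemp -d)
openssl req -x509 -newkey rsa:2048 -nodes -days 2 -subj "/CN=demo" \
  -keyout "$dir/tls.key" -out "$dir/tls.crt" 2>/dev/null
openssl x509 -noout -in "$dir/tls.crt" -checkend 86400
```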
	I1124 09:19:26.298937 1701291 kubeadm.go:401] StartCluster: {Name:functional-291288 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-291288 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1124 09:19:26.299045 1701291 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I1124 09:19:26.299146 1701291 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1124 09:19:26.324900 1701291 cri.go:89] found id: ""
	I1124 09:19:26.325047 1701291 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1124 09:19:26.331898 1701291 command_runner.go:130] > /var/lib/kubelet/config.yaml
	I1124 09:19:26.331976 1701291 command_runner.go:130] > /var/lib/kubelet/kubeadm-flags.env
	I1124 09:19:26.331999 1701291 command_runner.go:130] > /var/lib/minikube/etcd:
	I1124 09:19:26.332730 1701291 kubeadm.go:417] found existing configuration files, will attempt cluster restart
	I1124 09:19:26.332771 1701291 kubeadm.go:598] restartPrimaryControlPlane start ...
	I1124 09:19:26.332851 1701291 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I1124 09:19:26.340023 1701291 kubeadm.go:131] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I1124 09:19:26.340455 1701291 kubeconfig.go:47] verify endpoint returned: get endpoint: "functional-291288" does not appear in /home/jenkins/minikube-integration/21978-1652607/kubeconfig
	I1124 09:19:26.340556 1701291 kubeconfig.go:62] /home/jenkins/minikube-integration/21978-1652607/kubeconfig needs updating (will repair): [kubeconfig missing "functional-291288" cluster setting kubeconfig missing "functional-291288" context setting]
	I1124 09:19:26.340827 1701291 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21978-1652607/kubeconfig: {Name:mk02121ae6148bede61eabf0ed4e1826024715f4 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1124 09:19:26.341245 1701291 loader.go:402] Config loaded from file:  /home/jenkins/minikube-integration/21978-1652607/kubeconfig
	I1124 09:19:26.341410 1701291 kapi.go:59] client config for functional-291288: &rest.Config{Host:"https://192.168.49.2:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/client.crt", KeyFile:"/home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/client.key", CAFile:"/home/jenkins/minikube-integration/21978-1652607/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1fb2df0), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1124 09:19:26.341966 1701291 envvar.go:172] "Feature gate default state" feature="ClientsAllowCBOR" enabled=false
	I1124 09:19:26.341987 1701291 envvar.go:172] "Feature gate default state" feature="ClientsPreferCBOR" enabled=false
	I1124 09:19:26.341993 1701291 envvar.go:172] "Feature gate default state" feature="InformerResourceVersion" enabled=false
	I1124 09:19:26.341999 1701291 envvar.go:172] "Feature gate default state" feature="InOrderInformers" enabled=true
	I1124 09:19:26.342005 1701291 envvar.go:172] "Feature gate default state" feature="WatchListClient" enabled=false
	I1124 09:19:26.342302 1701291 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I1124 09:19:26.342404 1701291 cert_rotation.go:141] "Starting client certificate rotation controller" logger="tls-transport-cache"
	I1124 09:19:26.349720 1701291 kubeadm.go:635] The running cluster does not require reconfiguration: 192.168.49.2
	I1124 09:19:26.349757 1701291 kubeadm.go:602] duration metric: took 16.96677ms to restartPrimaryControlPlane
	I1124 09:19:26.349768 1701291 kubeadm.go:403] duration metric: took 50.840633ms to StartCluster
	I1124 09:19:26.349802 1701291 settings.go:142] acquiring lock: {Name:mk6c04793f5fd4f38f92abf4357247f2ccd7fc4e Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1124 09:19:26.349888 1701291 settings.go:150] Updating kubeconfig:  /home/jenkins/minikube-integration/21978-1652607/kubeconfig
	I1124 09:19:26.350548 1701291 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21978-1652607/kubeconfig: {Name:mk02121ae6148bede61eabf0ed4e1826024715f4 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1124 09:19:26.350757 1701291 start.go:236] Will wait 6m0s for node &{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I1124 09:19:26.351051 1701291 config.go:182] Loaded profile config "functional-291288": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1124 09:19:26.351103 1701291 addons.go:527] enable addons start: toEnable=map[ambassador:false amd-gpu-device-plugin:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:true efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubetail:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:true storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I1124 09:19:26.351171 1701291 addons.go:70] Setting storage-provisioner=true in profile "functional-291288"
	I1124 09:19:26.351184 1701291 addons.go:239] Setting addon storage-provisioner=true in "functional-291288"
	I1124 09:19:26.351210 1701291 host.go:66] Checking if "functional-291288" exists ...
	I1124 09:19:26.351260 1701291 addons.go:70] Setting default-storageclass=true in profile "functional-291288"
	I1124 09:19:26.351281 1701291 addons_storage_classes.go:34] enableOrDisableStorageClasses default-storageclass=true on "functional-291288"
	I1124 09:19:26.351591 1701291 cli_runner.go:164] Run: docker container inspect functional-291288 --format={{.State.Status}}
	I1124 09:19:26.351665 1701291 cli_runner.go:164] Run: docker container inspect functional-291288 --format={{.State.Status}}
	I1124 09:19:26.356026 1701291 out.go:179] * Verifying Kubernetes components...
	I1124 09:19:26.358753 1701291 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1124 09:19:26.386934 1701291 loader.go:402] Config loaded from file:  /home/jenkins/minikube-integration/21978-1652607/kubeconfig
	I1124 09:19:26.387124 1701291 kapi.go:59] client config for functional-291288: &rest.Config{Host:"https://192.168.49.2:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/client.crt", KeyFile:"/home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/client.key", CAFile:"/home/jenkins/minikube-integration/21978-1652607/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1fb2df0), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1124 09:19:26.387397 1701291 addons.go:239] Setting addon default-storageclass=true in "functional-291288"
	I1124 09:19:26.387423 1701291 host.go:66] Checking if "functional-291288" exists ...
	I1124 09:19:26.387832 1701291 cli_runner.go:164] Run: docker container inspect functional-291288 --format={{.State.Status}}
	I1124 09:19:26.389901 1701291 out.go:179]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	I1124 09:19:26.395008 1701291 addons.go:436] installing /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 09:19:26.395037 1701291 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
	I1124 09:19:26.395101 1701291 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:19:26.420232 1701291 addons.go:436] installing /etc/kubernetes/addons/storageclass.yaml
	I1124 09:19:26.420253 1701291 ssh_runner.go:362] scp storageclass/storageclass.yaml --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
	I1124 09:19:26.420313 1701291 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:19:26.425570 1701291 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34684 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-291288/id_rsa Username:docker}
	I1124 09:19:26.456516 1701291 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34684 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-291288/id_rsa Username:docker}
	I1124 09:19:26.560922 1701291 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1124 09:19:26.576856 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 09:19:26.613035 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
	I1124 09:19:27.382844 1701291 node_ready.go:35] waiting up to 6m0s for node "functional-291288" to be "Ready" ...
	I1124 09:19:27.383045 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:27.383222 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:27.383136 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:27.383333 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:27.383470 1701291 retry.go:31] will retry after 330.402351ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:27.383574 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:27.383622 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:27.383641 1701291 retry.go:31] will retry after 362.15201ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
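	The applies above fail because kubectl's client-side validation fetches the OpenAPI schema from the apiserver, which is still restarting ("connection refused"); minikube then retries after a few hundred milliseconds. A toy sketch of that retry shape, with a stand-in "apply" that fails once before succeeding (function name and delay are illustrative, not minikube's actual retry implementation):

```shell
# Stand-in for the failing "kubectl apply": fails until a marker file
# exists, mimicking the apiserver becoming reachable between attempts.
set -e
dir=$(mktemp -d)
apply_once() {
  if [ -f "$dir/apiserver-up" ]; then
    echo "applied"
  else
    touch "$dir/apiserver-up"   # apiserver comes up after the first try
    return 1
  fi
}
if ! apply_once; then
  sleep 0.3                     # backoff comparable to the ~330ms in the log
  apply_once
fi
```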
	I1124 09:19:27.383749 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:27.714181 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 09:19:27.746972 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1124 09:19:27.795758 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:27.795808 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:27.795853 1701291 retry.go:31] will retry after 486.739155ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:27.825835 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:27.825930 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:27.825968 1701291 retry.go:31] will retry after 300.110995ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:27.884058 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:27.884183 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:27.884499 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:28.126983 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1124 09:19:28.217006 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:28.217052 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:28.217072 1701291 retry.go:31] will retry after 300.765079ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:28.283248 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 09:19:28.347318 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:28.347417 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:28.347441 1701291 retry.go:31] will retry after 303.335388ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:28.383528 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:28.383642 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:28.383982 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:28.518292 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1124 09:19:28.580592 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:28.580640 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:28.580660 1701291 retry.go:31] will retry after 1.066338993s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:28.651903 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 09:19:28.713844 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:28.713897 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:28.713918 1701291 retry.go:31] will retry after 1.056665241s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:28.884118 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:28.884220 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:28.884569 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:29.383298 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:29.383424 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:29.383770 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:19:29.383848 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:19:29.647985 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1124 09:19:29.716805 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:29.720169 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:29.720200 1701291 retry.go:31] will retry after 944.131514ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:29.771443 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 09:19:29.838798 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:29.842880 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:29.842911 1701291 retry.go:31] will retry after 1.275018698s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:29.883132 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:29.883209 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:29.883509 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:30.383649 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:30.383776 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:30.384127 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:30.664505 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1124 09:19:30.720036 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:30.723467 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:30.723535 1701291 retry.go:31] will retry after 2.138623105s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:30.883817 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:30.883887 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:30.884224 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:31.118957 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 09:19:31.199799 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:31.199840 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:31.199882 1701291 retry.go:31] will retry after 2.182241097s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:31.383252 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:31.383376 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:31.383741 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:31.883141 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:31.883218 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:31.883484 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:19:31.883535 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:19:32.383203 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:32.383282 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:32.383615 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:32.863283 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1124 09:19:32.883678 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:32.883784 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:32.884128 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:32.923038 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:32.923079 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:32.923098 1701291 retry.go:31] will retry after 3.572603171s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:33.382308 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 09:19:33.383761 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:33.383826 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:33.384119 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:33.453074 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:33.453119 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:33.453141 1701291 retry.go:31] will retry after 3.109489242s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:33.883699 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:33.883773 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:33.884102 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:19:33.884157 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:19:34.383924 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:34.383999 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:34.384345 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:34.883591 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:34.883679 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:34.883980 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:35.383814 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:35.383894 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:35.384241 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:35.884036 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:35.884171 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:35.884537 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:19:35.884594 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:19:36.383696 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:36.383766 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:36.384025 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:36.496437 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1124 09:19:36.551663 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:36.555562 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:36.555638 1701291 retry.go:31] will retry after 5.073494199s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:36.562783 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 09:19:36.628271 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:36.628317 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:36.628342 1701291 retry.go:31] will retry after 5.770336946s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:36.883845 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:36.883918 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:36.884243 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:37.384077 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:37.384153 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:37.384472 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:37.883154 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:37.883226 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:37.883536 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:38.383157 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:38.383232 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:38.383563 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:19:38.383620 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:19:38.883187 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:38.883316 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:38.883616 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:39.383889 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:39.383969 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:39.384246 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:39.884093 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:39.884182 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:39.884521 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:40.383195 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:40.383272 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:40.383608 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:19:40.383663 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:19:40.883068 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:40.883144 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:40.883421 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:41.383174 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:41.383248 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:41.383670 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:41.630088 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1124 09:19:41.704671 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:41.704728 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:41.704747 1701291 retry.go:31] will retry after 8.448093656s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:41.884076 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:41.884161 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:41.884479 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:42.383803 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:42.383879 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:42.384141 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:19:42.384184 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:19:42.399541 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 09:19:42.476011 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:42.476071 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:42.476093 1701291 retry.go:31] will retry after 9.502945959s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:42.883588 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:42.883671 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:42.884026 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:43.383828 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:43.383907 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:43.384181 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:43.883670 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:43.883743 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:43.884060 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:44.383696 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:44.383771 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:44.384127 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:19:44.384222 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:19:44.883976 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:44.884053 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:44.884413 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:45.383648 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:45.383811 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:45.384197 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:45.883998 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:45.884089 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:45.884467 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:46.383603 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:46.383678 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:46.384022 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:46.883619 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:46.883699 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:46.883981 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:19:46.884038 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:19:47.383777 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:47.383855 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:47.384200 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:47.883911 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:47.884016 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:47.884384 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:48.383668 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:48.383739 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:48.384087 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:48.883874 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:48.883952 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:48.884283 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:19:48.884343 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:19:49.383082 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:49.383173 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:49.383540 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:49.883082 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:49.883151 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:49.883411 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:50.153986 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1124 09:19:50.216789 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:50.216837 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:50.216857 1701291 retry.go:31] will retry after 12.027560843s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:50.383117 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:50.383226 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:50.383583 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:50.883287 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:50.883368 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:50.883726 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:51.383622 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:51.383710 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:51.384038 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:19:51.384100 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:19:51.883690 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:51.883770 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:51.884105 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:51.979351 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 09:19:52.048232 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:52.048287 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:52.048307 1701291 retry.go:31] will retry after 5.922680138s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:52.383846 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:52.383926 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:52.384262 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:52.883642 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:52.883714 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:52.884029 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:53.383844 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:53.383917 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:53.384249 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:19:53.384309 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:19:53.884029 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:53.884108 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:53.884493 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:54.383680 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:54.383755 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:54.384008 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:54.883852 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:54.883926 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:54.884262 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:55.384060 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:55.384132 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:55.384467 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:19:55.384528 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:19:55.883800 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:55.883874 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:55.884153 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:56.383266 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:56.383344 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:56.383682 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:56.883176 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:56.883258 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:56.883607 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:57.383853 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:57.383936 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:57.384284 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:57.884078 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:57.884157 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:57.884542 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:19:57.884608 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:19:57.972042 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 09:19:58.032393 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:19:58.036131 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:58.036169 1701291 retry.go:31] will retry after 15.323516146s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:19:58.383700 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:58.383776 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:58.384074 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:58.883637 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:58.883711 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:58.883992 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:59.383767 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:59.383847 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:59.384170 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:19:59.883954 1701291 type.go:168] "Request Body" body=""
	I1124 09:19:59.884029 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:19:59.884364 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:00.386702 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:00.386929 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:00.387350 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:00.387652 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:00.883998 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:00.884089 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:00.884461 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:01.383250 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:01.383328 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:01.383704 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:01.883996 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:01.884068 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:01.884357 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:02.244687 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1124 09:20:02.303604 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:20:02.306952 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:20:02.306992 1701291 retry.go:31] will retry after 20.630907774s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:20:02.383202 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:02.383281 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:02.383599 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:02.883330 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:02.883410 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:02.883745 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:02.883800 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:03.383311 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:03.383386 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:03.383651 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:03.883196 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:03.883275 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:03.883594 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:04.383175 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:04.383259 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:04.383568 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:04.883120 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:04.883192 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:04.883478 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:05.383217 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:05.383295 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:05.383624 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:05.383680 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:05.883368 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:05.883446 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:05.883773 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:06.383723 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:06.383806 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:06.384068 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:06.883869 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:06.883945 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:06.884264 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:07.384063 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:07.384138 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:07.384462 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:07.384526 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:07.883109 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:07.883188 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:07.883446 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:08.383152 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:08.383224 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:08.383566 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:08.883199 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:08.883279 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:08.883603 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:09.383126 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:09.383212 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:09.383470 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:09.883162 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:09.883254 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:09.883549 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:09.883599 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:10.383190 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:10.383264 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:10.383586 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:10.883817 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:10.883892 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:10.884145 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:11.383273 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:11.383344 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:11.383622 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:11.883313 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:11.883389 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:11.883749 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:11.883806 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:12.383448 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:12.383520 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:12.383791 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:12.883177 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:12.883257 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:12.883572 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:13.360275 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 09:20:13.383805 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:13.383886 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:13.384154 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:13.423794 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:20:13.423847 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:20:13.423866 1701291 retry.go:31] will retry after 19.725114159s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:20:13.884034 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:13.884124 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:13.884430 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:13.884481 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:14.383158 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:14.383258 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:14.383624 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:14.883202 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:14.883284 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:14.883644 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:15.383356 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:15.383435 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:15.383734 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:15.883472 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:15.883549 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:15.883909 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:16.384044 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:16.384118 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:16.384464 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:16.384554 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:16.883205 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:16.883292 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:16.883609 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:17.383212 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:17.383289 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:17.383587 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:17.883207 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:17.883289 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:17.883594 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:18.383758 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:18.383840 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:18.384110 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:18.883984 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:18.884085 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:18.884539 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:18.884620 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:19.383195 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:19.383308 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:19.383679 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:19.883124 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:19.883204 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:19.883474 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:20.383183 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:20.383264 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:20.383612 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:20.883327 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:20.883410 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:20.883750 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:21.383113 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:21.383189 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:21.383447 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:21.383491 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:21.883186 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:21.883258 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:21.883619 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:22.383192 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:22.383277 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:22.383650 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:22.883350 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:22.883422 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:22.883692 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:22.939045 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1124 09:20:23.002892 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:20:23.002941 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:20:23.002963 1701291 retry.go:31] will retry after 24.365576381s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:20:23.384046 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:23.384125 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:23.384460 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:23.384522 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:23.883216 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:23.883293 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:23.883634 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:24.383833 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:24.383929 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:24.384212 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:24.884088 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:24.884168 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:24.884519 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:25.383227 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:25.383307 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:25.383654 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:25.883912 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:25.883982 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:25.884337 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:25.884396 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:26.383528 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:26.383619 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:26.383952 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:26.883735 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:26.883810 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:26.884149 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:27.383645 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:27.383725 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:27.384079 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:27.883693 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:27.883792 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:27.884080 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:28.383869 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:28.383941 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:28.384276 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:28.384333 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:28.883621 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:28.883696 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:28.884021 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:29.383693 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:29.383768 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:29.384125 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:29.883838 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:29.883920 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:29.884279 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:30.383628 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:30.383705 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:30.383961 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:30.883414 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:30.883492 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:30.883837 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:30.883893 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:31.383689 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:31.383767 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:31.384087 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:31.883629 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:31.883699 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:31.883964 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:32.383819 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:32.383897 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:32.384254 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:32.884067 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:32.884145 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:32.884453 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:32.884504 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:33.149949 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 09:20:33.204697 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:20:33.208037 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:20:33.208070 1701291 retry.go:31] will retry after 22.392696015s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:20:33.383469 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:33.383538 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:33.383796 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:33.883550 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:33.883634 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:33.883947 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:34.383737 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:34.383811 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:34.384171 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:34.883654 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:34.883734 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:34.884066 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:35.383856 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:35.383928 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:35.384271 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:35.384326 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:35.883926 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:35.884005 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:35.884370 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:36.383314 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:36.383384 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:36.383644 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:36.883149 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:36.883225 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:36.883565 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:37.383275 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:37.383359 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:37.383702 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:37.883387 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:37.883466 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:37.883722 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:37.883762 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:38.383178 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:38.383252 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:38.383603 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:38.883170 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:38.883244 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:38.883650 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:39.383205 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:39.383274 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:39.383534 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:39.883192 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:39.883267 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:39.883632 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:40.383376 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:40.383463 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:40.383839 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:40.383896 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:40.883093 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:40.883170 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:40.883479 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:41.383194 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:41.383276 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:41.383635 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:41.883337 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:41.883422 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:41.883716 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:42.383386 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:42.383461 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:42.383814 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:42.883190 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:42.883266 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:42.883601 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:42.883670 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:43.383217 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:43.383293 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:43.383671 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:43.883117 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:43.883198 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:43.883473 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:44.383174 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:44.383255 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:44.383558 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:44.883289 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:44.883363 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:44.883641 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:45.383065 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:45.383134 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:45.383415 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:45.383456 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:45.883191 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:45.883274 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:45.883563 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:46.383411 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:46.383487 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:46.383849 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:46.883402 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:46.883490 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:46.883752 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:47.369539 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1124 09:20:47.383080 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:47.383149 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:47.383440 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:47.383498 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:47.426348 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:20:47.429646 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:20:47.429686 1701291 retry.go:31] will retry after 22.399494886s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1124 09:20:47.883262 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:47.883365 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:47.883699 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:48.383121 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:48.383192 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:48.383450 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:48.883172 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:48.883246 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:48.883565 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:49.383175 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:49.383254 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:49.383619 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:49.383673 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:49.883307 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:49.883381 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:49.883648 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:50.383167 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:50.383242 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:50.383602 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:50.883305 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:50.883403 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:50.883701 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:51.383597 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:51.383671 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:51.383949 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:51.383999 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:51.883799 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:51.883891 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:51.884215 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:52.383953 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:52.384046 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:52.384337 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:52.883622 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:52.883695 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:52.883974 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:53.383750 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:53.383840 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:53.384189 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:53.384246 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:53.883872 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:53.883946 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:53.884278 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:54.383691 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:54.383768 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:54.384062 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:54.883870 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:54.883951 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:54.884279 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:55.384078 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:55.384159 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:55.384531 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:55.384594 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:55.601942 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1124 09:20:55.661064 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:20:55.665031 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:20:55.665156 1701291 out.go:285] ! Enabling 'storage-provisioner' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	I1124 09:20:55.883471 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:55.883549 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:55.883839 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:56.384006 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:56.384085 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:56.384438 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:56.883159 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:56.883256 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:56.883546 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:57.383058 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:57.383127 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:57.383401 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:57.883132 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:57.883210 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:57.883522 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:57.883572 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:20:58.383147 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:58.383243 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:58.383538 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:58.883654 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:58.883729 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:58.883987 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:59.383805 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:59.383888 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:59.384179 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:20:59.883985 1701291 type.go:168] "Request Body" body=""
	I1124 09:20:59.884058 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:20:59.884355 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:20:59.884403 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:00.383754 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:00.383833 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:00.384151 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:00.883935 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:00.884016 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:00.884352 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:01.383267 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:01.383344 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:01.383652 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:01.883147 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:01.883222 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:01.883556 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:02.383255 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:02.383332 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:02.383663 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:02.383721 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:02.883448 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:02.883530 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:02.883895 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:03.383623 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:03.383692 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:03.383959 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:03.883727 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:03.883833 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:03.884183 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:04.383989 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:04.384068 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:04.384431 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:04.384491 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:04.883658 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:04.883737 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:04.884051 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:05.383792 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:05.383863 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:05.384221 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:05.883873 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:05.883951 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:05.884288 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:06.383270 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:06.383343 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:06.383618 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:06.883169 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:06.883268 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:06.883573 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:06.883620 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:07.383342 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:07.383427 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:07.383765 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:07.884027 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:07.884094 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:07.884425 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:08.383164 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:08.383239 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:08.383598 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:08.883308 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:08.883398 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:08.883741 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:08.883802 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:09.383098 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:09.383166 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:09.383423 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:09.830147 1701291 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1124 09:21:09.883707 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:09.883815 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:09.884234 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:09.887265 1701291 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:21:09.890761 1701291 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1124 09:21:09.890861 1701291 out.go:285] ! Enabling 'default-storageclass' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	I1124 09:21:09.895836 1701291 out.go:179] * Enabled addons: 
	I1124 09:21:09.899594 1701291 addons.go:530] duration metric: took 1m43.548488453s for enable addons: enabled=[]
	I1124 09:21:10.383381 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:10.383468 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:10.383851 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:10.883541 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:10.883612 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:10.883871 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:10.883921 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:11.383721 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:11.383804 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:11.384146 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:11.883758 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:11.883832 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:11.884153 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:12.383650 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:12.383725 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:12.383994 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:12.883791 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:12.883869 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:12.884200 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:12.884259 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:13.384051 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:13.384130 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:13.384481 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:13.883069 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:13.883147 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:13.883443 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:14.383158 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:14.383256 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:14.383600 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:14.883308 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:14.883386 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:14.883743 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:15.383457 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:15.383524 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:15.383790 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:15.383833 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:15.883160 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:15.883235 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:15.883570 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:16.383347 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:16.383428 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:16.383759 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:16.883325 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:16.883399 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:16.883664 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:17.383191 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:17.383290 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:17.383661 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:17.883228 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:17.883306 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:17.883672 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:17.883730 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:18.383978 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:18.384061 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:18.384373 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:18.883112 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:18.883209 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:18.883544 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:19.383112 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:19.383198 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:19.383544 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:19.883660 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:19.883735 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:19.883994 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:19.884034 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:20.383840 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:20.383939 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:20.384276 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:20.884063 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:20.884139 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:20.884609 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:21.383122 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:21.383191 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:21.383474 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:21.883229 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:21.883311 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:21.883643 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:22.383182 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:22.383259 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:22.383610 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:22.383663 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:22.884007 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:22.884077 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:22.884343 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:23.384148 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:23.384238 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:23.384581 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:23.883132 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:23.883207 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:23.883554 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:24.383088 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:24.383159 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:24.383481 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:24.883179 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:24.883275 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:24.883610 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:24.883675 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:25.383186 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:25.383268 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:25.383608 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:25.883472 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:25.883586 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:25.884129 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:26.383146 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:26.383230 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:26.383577 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:26.883188 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:26.883299 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:26.883678 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:26.883758 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:27.384111 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:27.384181 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:27.384491 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:27.883092 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:27.883171 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:27.883515 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:28.383158 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:28.383240 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:28.383623 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:28.883309 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:28.883385 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:28.883717 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:29.383191 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:29.383266 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:29.383650 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:29.383702 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:29.883188 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:29.883275 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:29.883613 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:30.383126 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:30.383193 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:30.383490 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:30.883171 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:30.883254 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:30.883605 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:31.383171 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:31.383250 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:31.383601 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:31.883122 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:31.883191 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:31.883449 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:31.883489 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:32.383217 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:32.383291 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:32.383620 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:32.883183 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:32.883260 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:32.883629 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:33.383180 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:33.383262 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:33.383549 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:33.883190 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:33.883271 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:33.883638 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:33.883694 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:34.383256 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:34.383337 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:34.383680 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:34.883132 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:34.883214 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:34.883526 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:35.383172 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:35.383252 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:35.383615 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:35.883199 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:35.883282 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:35.883582 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:36.383281 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:36.383351 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:36.383609 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:36.383650 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:36.883304 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:36.883386 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:36.883706 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:37.383434 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:37.383512 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:37.383858 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:37.883555 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:37.883635 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:37.883920 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:38.383713 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:38.383800 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:38.384150 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:38.384211 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:38.884005 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:38.884085 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:38.884432 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:39.383117 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:39.383189 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:39.383470 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:39.883210 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:39.883320 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:39.883681 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:40.383192 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:40.383273 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:40.383648 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:40.883329 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:40.883413 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:40.883677 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:40.883719 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:41.383810 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:41.383891 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:41.384260 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:41.884110 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:41.884211 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:41.884610 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:42.383111 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:42.383184 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:42.383469 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:42.883146 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:42.883219 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:42.883556 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:43.383303 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:43.383390 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:43.383815 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:43.383880 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:43.884150 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:43.884225 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:43.884489 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:44.383187 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:44.383285 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:44.383631 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:44.883347 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:44.883424 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:44.883787 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:45.383143 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:45.383221 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:45.383485 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:45.883220 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:45.883291 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:45.883631 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:45.883683 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:46.383565 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:46.383643 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:46.384005 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:46.883681 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:46.883753 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:46.884095 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:47.383931 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:47.384032 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:47.384438 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:47.884098 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:47.884173 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:47.884475 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:47.884521 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:48.383141 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:48.383214 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:48.383504 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:48.883189 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:48.883295 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:48.883641 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:49.383237 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:49.383316 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:49.383651 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:49.883138 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:49.883214 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:49.883514 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:50.383163 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:50.383242 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:50.383592 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:50.383651 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:50.883194 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:50.883284 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:50.883599 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:51.383074 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:51.383155 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:51.383436 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:51.883141 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:51.883231 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:51.883582 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:52.383155 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:52.383242 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:52.383575 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:52.883252 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:52.883327 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:52.883595 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:52.883642 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:53.383321 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:53.383392 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:53.383737 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:53.883216 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:53.883293 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:53.883646 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:54.383339 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:54.383413 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:54.383688 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:54.883201 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:54.883274 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:54.883590 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:55.383288 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:55.383366 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:55.383704 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:55.383769 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:55.883435 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:55.883505 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:55.883816 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:56.384009 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:56.384088 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:56.384422 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:56.883137 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:56.883222 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:56.883558 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:57.383818 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:57.383897 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:57.384172 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:57.384212 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:21:57.883977 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:57.884053 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:57.884399 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:58.383153 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:58.383233 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:58.383556 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:58.883101 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:58.883177 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:58.883433 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:59.383157 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:59.383236 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:59.383565 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:21:59.883168 1701291 type.go:168] "Request Body" body=""
	I1124 09:21:59.883258 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:21:59.883650 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:21:59.883705 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:00.383305 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:00.383386 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:00.383837 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:00.883166 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:00.883245 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:00.883577 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:01.383186 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:01.383267 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:01.383606 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:01.883853 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:01.883923 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:01.884206 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:01.884257 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:02.384016 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:02.384095 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:02.384455 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:02.884100 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:02.884181 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:02.884522 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:03.383131 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:03.383207 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:03.383521 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:03.883200 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:03.883287 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:03.883604 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:04.383190 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:04.383274 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:04.383643 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:04.383702 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:04.883207 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:04.883279 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:04.883551 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:05.383188 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:05.383272 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:05.383607 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:05.883177 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:05.883257 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:05.883627 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:06.383792 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:06.383879 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:06.384240 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:06.384291 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:06.884040 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:06.884120 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:06.884445 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:07.383154 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:07.383230 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:07.383566 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:07.883872 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:07.883944 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:07.884212 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:08.383967 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:08.384042 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:08.384363 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:08.384428 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:08.883105 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:08.883184 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:08.883520 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:09.383654 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:09.383727 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:09.384039 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:09.883710 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:09.883788 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:09.884141 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:10.383940 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:10.384022 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:10.384358 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:10.883643 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:10.883717 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:10.883979 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:10.884026 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:11.384043 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:11.384119 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:11.384476 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:11.884107 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:11.884182 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:11.884497 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:12.383080 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:12.383156 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:12.383420 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:12.883114 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:12.883197 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:12.883546 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:13.383148 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:13.383235 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:13.383567 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:13.383626 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:13.883128 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:13.883206 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:13.883519 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:14.383161 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:14.383237 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:14.383589 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:14.883306 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:14.883385 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:14.883735 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:15.384005 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:15.384082 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:15.384357 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:15.384407 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:15.883074 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:15.883147 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:15.883531 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:16.383351 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:16.383433 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:16.383810 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:16.883361 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:16.883437 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:16.883741 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:17.383154 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:17.383240 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:17.383580 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:17.883164 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:17.883241 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:17.883543 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:17.883590 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:18.383110 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:18.383195 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:18.383512 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:18.883210 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:18.883284 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:18.883632 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:19.383181 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:19.383265 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:19.383589 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:19.883083 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:19.883153 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:19.883418 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:20.383720 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:20.383806 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:20.384138 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:20.384189 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:20.883895 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:20.883977 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:20.884383 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:21.383110 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:21.383179 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:21.383449 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:21.883148 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:21.883222 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:21.883554 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:22.383181 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:22.383267 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:22.383745 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:22.883438 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:22.883512 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:22.883827 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:22.883878 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:23.383219 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:23.383300 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:23.383650 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:23.883352 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:23.883439 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:23.883739 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:24.383094 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:24.383172 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:24.383441 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:24.883170 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:24.883246 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:24.883573 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:25.383180 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:25.383258 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:25.383557 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:25.383602 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:25.883124 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:25.883200 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:25.883530 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:26.383418 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:26.383502 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:26.383820 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:26.883156 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:26.883232 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:26.883574 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:27.383253 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:27.383325 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:27.383640 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:27.383695 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:27.883229 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:27.883308 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:27.883663 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:28.383352 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:28.383428 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:28.383771 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:28.883152 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:28.883224 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:28.883533 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:29.383259 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:29.383346 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:29.383718 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:29.383781 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:29.883468 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:29.883551 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:29.883860 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:30.383098 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:30.383174 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:30.383431 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:30.883167 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:30.883258 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:30.883626 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:31.383185 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:31.383269 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:31.383600 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:31.883135 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:31.883200 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:31.883477 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:31.883524 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:32.383251 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:32.383334 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:32.383667 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:32.883186 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:32.883294 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:32.883590 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:33.383124 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:33.383196 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:33.383537 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:33.883238 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:33.883319 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:33.883668 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:33.883725 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:34.383411 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:34.383500 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:34.383842 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:34.883113 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:34.883201 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:34.883459 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:35.383178 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:35.383251 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:35.383573 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:35.883163 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:35.883245 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:35.883570 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:36.383743 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:36.383821 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:36.384077 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:36.384116 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:36.883871 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:36.883954 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:36.884285 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:37.384043 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:37.384116 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:37.384446 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:37.883126 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:37.883195 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:37.883464 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:38.383151 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:38.383233 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:38.383571 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:38.883272 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:38.883352 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:38.883655 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:38.883702 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:39.383349 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:39.383416 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:39.383686 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:39.883194 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:39.883268 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:39.883616 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:40.383355 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:40.383439 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:40.383825 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:40.883050 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:40.883119 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:40.883381 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:41.383178 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:41.383263 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:41.383602 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:41.383659 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:41.883334 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:41.883418 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:41.883737 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:42.383098 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:42.383164 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:42.383505 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:42.883181 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:42.883256 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:42.883607 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:43.383328 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:43.383407 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:43.383779 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:43.383848 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:43.883040 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:43.883108 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:43.883373 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:44.383062 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:44.383137 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:44.383488 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:44.883210 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:44.883294 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:44.883624 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:45.383177 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:45.383283 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:45.383549 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:45.883285 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:45.883371 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:45.883679 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:45.883730 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:46.383910 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:46.383988 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:46.384338 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:46.883634 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:46.883708 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:46.883972 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:47.383794 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:47.383890 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:47.384333 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:47.883084 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:47.883172 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:47.883512 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:48.383207 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:48.383278 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:48.383553 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:48.383599 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:48.883146 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:48.883219 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:48.883545 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:49.383190 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:49.383267 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:49.383618 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:49.883863 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:49.883935 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:49.884201 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:50.384017 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:50.384095 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:50.384461 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:50.384517 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:50.883189 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:50.883269 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:50.883636 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:51.383067 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:51.383134 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:51.383393 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:51.883095 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:51.883170 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:51.883486 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:52.383088 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:52.383168 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:52.383503 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:52.883649 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:52.883715 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:52.883972 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:52.884013 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:53.383510 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:53.383586 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:53.383942 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:53.883728 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:53.883810 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:53.884186 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:54.383720 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:54.383800 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:54.384075 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:54.883881 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:54.883959 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:54.884315 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:54.884374 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:55.383090 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:55.383174 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:55.383511 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:55.883189 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:55.883267 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:55.883536 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:56.383980 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:56.384072 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:56.384430 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:56.883181 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:56.883264 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:56.883592 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:57.383208 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:57.383283 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:57.383562 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:57.383632 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:22:57.883178 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:57.883262 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:57.883557 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:58.383270 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:58.383366 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:58.383681 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:58.883065 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:58.883142 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:58.883409 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:59.383139 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:59.383281 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:59.383599 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:22:59.883295 1701291 type.go:168] "Request Body" body=""
	I1124 09:22:59.883368 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:22:59.883709 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:22:59.883767 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:00.383455 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:00.383535 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:00.383834 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:00.883728 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:00.883804 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:00.884143 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:01.383926 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:01.384011 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:01.384371 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:01.883658 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:01.883732 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:01.884049 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:01.884099 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:02.383854 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:02.383936 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:02.384276 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:02.883965 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:02.884044 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:02.884421 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:03.383112 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:03.383191 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:03.383460 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:03.883214 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:03.883310 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:03.883701 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:04.383439 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:04.383520 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:04.383845 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:04.383903 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:04.883132 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:04.883203 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:04.883548 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:05.383170 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:05.383248 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:05.383591 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:05.883308 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:05.883386 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:05.883685 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:06.383882 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:06.383959 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:06.384277 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:06.384350 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:06.884102 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:06.884178 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:06.884513 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:07.383085 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:07.383184 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:07.383543 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:07.883883 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:07.883956 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:07.884221 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:08.384050 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:08.384123 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:08.384452 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:08.384509 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:08.883183 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:08.883259 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:08.883584 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:09.383246 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:09.383318 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:09.383640 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:09.883219 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:09.883299 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:09.883648 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:10.383378 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:10.383458 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:10.383753 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:10.883317 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:10.883388 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:10.883680 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:10.883723 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:11.383702 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:11.383803 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:11.384131 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:11.883721 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:11.883799 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:11.884129 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:12.383663 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:12.383738 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:12.384067 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:12.883854 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:12.883940 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:12.884274 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:12.884334 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:13.384116 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:13.384195 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:13.384538 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:13.883793 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:13.883869 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:13.884135 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:14.383911 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:14.383994 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:14.384297 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:14.883958 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:14.884048 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:14.884401 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:14.884456 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:15.383622 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:15.383700 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:15.383974 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:15.883699 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:15.883778 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:15.884117 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:16.384146 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:16.384226 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:16.384578 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:16.883198 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:16.883271 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:16.883565 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:17.383190 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:17.383273 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:17.383627 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:17.383682 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:17.883355 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:17.883436 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:17.883756 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:18.383116 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:18.383185 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:18.383441 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:18.883189 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:18.883268 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:18.883596 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:19.383187 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:19.383269 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:19.383630 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:19.883313 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:19.883385 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:19.883674 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:19.883725 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:20.383174 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:20.383257 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:20.383614 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:20.883340 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:20.883415 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:20.883771 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:21.383646 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:21.383720 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:21.383985 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:21.883845 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:21.883928 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:21.884313 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:21.884368 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:22.383054 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:22.383138 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:22.383471 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:22.883086 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:22.883170 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:22.883470 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:23.383158 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:23.383233 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:23.383562 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:23.883247 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:23.883321 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:23.883637 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:24.383095 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:24.383165 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:24.383431 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:24.383471 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:24.883124 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:24.883205 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:24.883534 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:25.383173 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:25.383251 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:25.383575 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:25.883126 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:25.883196 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:25.883470 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:26.383072 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:26.383157 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:26.383507 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:26.383568 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:26.883510 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:26.883587 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:26.883957 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:27.383649 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:27.383724 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:27.384102 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:27.883942 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:27.884029 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:27.884418 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:28.383174 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:28.383254 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:28.383611 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:28.383665 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:28.883114 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:28.883191 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:28.883456 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:29.383167 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:29.383248 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:29.383597 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:29.883182 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:29.883258 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:29.883579 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:30.383122 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:30.383199 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:30.383527 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:30.883137 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:30.883215 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:30.883514 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:30.883561 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:31.383185 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:31.383260 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:31.383609 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:31.883900 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:31.883970 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:31.884278 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:32.384086 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:32.384160 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:32.384455 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:32.883182 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:32.883259 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:32.883600 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:32.883657 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:33.383111 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:33.383190 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:33.383455 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:33.883174 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:33.883260 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:33.883641 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:34.383362 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:34.383442 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:34.383802 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:34.883106 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:34.883183 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:34.883439 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:35.383143 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:35.383220 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:35.383551 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:35.383604 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:35.883151 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:35.883229 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:35.883562 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:36.383293 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:36.383366 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:36.383619 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:36.883173 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:36.883255 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:36.883580 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:37.383161 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:37.383237 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:37.383584 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:37.383635 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:37.883107 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:37.883182 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:37.883504 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:38.383163 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:38.383248 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:38.383593 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:38.883178 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:38.883257 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:38.883615 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:39.383868 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:39.383940 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:39.384210 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:39.384251 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:39.883999 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:39.884075 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:39.884422 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:40.383152 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:40.383231 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:40.383560 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:40.883278 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:40.883355 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:40.883619 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:41.383462 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:41.383549 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:41.383883 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:41.883473 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:41.883550 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:41.883893 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:41.883952 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:42.383654 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:42.383728 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:42.384013 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:42.883799 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:42.883875 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:42.884236 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:43.384072 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:43.384157 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:43.384486 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:43.883149 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:43.883222 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:43.883524 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:44.383233 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:44.383315 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:44.383652 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:44.383713 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:44.883155 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:44.883243 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:44.883579 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:45.383126 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:45.383203 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:45.383524 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:45.883188 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:45.883264 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:45.883628 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:46.383346 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:46.383429 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:46.383765 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:46.383819 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:46.883878 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:46.883951 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:46.884224 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:47.384060 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:47.384136 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:47.384469 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:47.883170 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:47.883249 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:47.883589 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:48.383128 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:48.383211 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:48.383474 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:48.883184 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:48.883264 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:48.883602 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:48.883663 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:49.383164 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:49.383241 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:49.383569 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:49.883290 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:49.883367 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:49.883671 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:50.383185 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:50.383268 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:50.383648 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:50.883431 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:50.883514 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:50.883850 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:50.883909 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:51.383652 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:51.383720 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:51.383978 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:51.883444 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:51.883523 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:51.883866 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:52.383586 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:52.383680 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:52.384026 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:52.883655 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:52.883728 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:52.884053 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:52.884105 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:53.383855 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:53.383945 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:53.384271 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:53.884101 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:53.884186 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:53.884529 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:54.383101 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:54.383176 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:54.383443 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:54.883148 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:54.883222 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:54.883575 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:55.383191 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:55.383270 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:55.383608 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:55.383664 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:55.883870 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:55.883946 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:55.884289 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:56.383256 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:56.383351 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:56.383747 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:56.883462 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:56.883538 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:56.883871 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:57.383563 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:57.383638 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:57.383899 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:57.383944 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:23:57.883683 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:57.883768 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:57.884147 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:58.383932 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:58.384008 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:58.384395 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:58.883091 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:58.883159 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:58.883412 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:59.383084 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:59.383166 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:59.383498 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:23:59.883178 1701291 type.go:168] "Request Body" body=""
	I1124 09:23:59.883259 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:23:59.883595 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:23:59.883655 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:00.392124 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:00.392210 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:00.392556 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:00.883180 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:00.883282 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:00.883653 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:01.383190 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:01.383267 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:01.383567 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:01.883245 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:01.883313 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:01.883583 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:02.383189 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:02.383267 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:02.383605 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:02.383667 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:02.883233 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:02.883317 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:02.883620 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:03.383856 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:03.383927 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:03.384185 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:03.884056 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:03.884135 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:03.884494 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:04.383223 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:04.383311 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:04.383613 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:04.883276 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:04.883345 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:04.883599 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:04.883643 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:05.383163 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:05.383239 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:05.383541 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:05.883213 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:05.883295 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:05.883634 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:06.383304 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:06.383375 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:06.383679 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:06.883401 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:06.883483 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:06.883806 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:06.883865 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:07.383190 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:07.383271 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:07.383648 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:07.883325 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:07.883398 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:07.883710 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:08.383188 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:08.383266 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:08.383566 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:08.883267 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:08.883345 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:08.883690 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:09.384042 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:09.384118 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:09.384458 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:09.384510 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:09.883180 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:09.883253 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:09.883573 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:10.383180 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:10.383261 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:10.383583 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:10.883127 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:10.883204 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:10.883474 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:11.383167 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:11.383240 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:11.383552 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:11.883157 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:11.883234 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:11.883563 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:11.883618 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:12.383084 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:12.383153 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:12.383411 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:12.883181 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:12.883256 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:12.883591 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:13.383188 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:13.383265 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:13.383586 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:13.883123 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:13.883204 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:13.883485 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:14.383559 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:14.383638 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:14.383953 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:14.384012 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:14.883792 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:14.883868 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:14.884213 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:15.383592 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:15.383667 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:15.383925 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:15.883767 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:15.883843 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:15.884202 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:16.383348 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:16.383420 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:16.383758 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:16.883457 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:16.883538 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:16.883795 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:16.883837 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:17.383161 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:17.383238 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:17.383573 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:17.883186 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:17.883271 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:17.883611 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:18.383302 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:18.383377 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:18.383637 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:18.883191 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:18.883269 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:18.883626 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:19.383147 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:19.383224 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:19.383554 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:19.383616 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:19.883116 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:19.883185 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:19.883449 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:20.383135 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:20.383213 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:20.383531 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:20.883180 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:20.883260 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:20.883559 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:21.383124 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:21.383200 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:21.383460 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:21.883132 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:21.883213 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:21.883553 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:21.883608 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:22.383153 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:22.383241 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:22.383543 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:22.883110 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:22.883179 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:22.883439 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:23.383165 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:23.383246 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:23.383640 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:23.883372 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:23.883448 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:23.883789 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:23.883846 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:24.383112 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:24.383188 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:24.383507 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:24.883200 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:24.883275 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:24.883561 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:25.383261 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:25.383336 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:25.383674 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:25.883358 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:25.883437 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:25.883749 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:26.383290 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:26.383368 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:26.383724 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:26.383783 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:26.883478 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:26.883555 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:26.883888 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:27.383604 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:27.383677 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:27.383939 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:27.883757 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:27.883845 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:27.884167 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:28.383852 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:28.383928 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:28.384269 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:28.384325 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:28.883626 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:28.883692 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:28.883958 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:29.383717 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:29.383796 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:29.384139 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:29.883960 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:29.884036 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:29.884369 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:30.383625 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:30.383694 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:30.383980 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:30.883744 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:30.883816 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:30.884150 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:30.884205 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:31.383977 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:31.384060 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:31.384393 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:31.883624 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:31.883716 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:31.883977 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:32.383741 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:32.383814 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:32.384155 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:32.883968 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:32.884055 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:32.884386 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:32.884443 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:33.383735 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:33.383805 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:33.384072 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:33.883915 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:33.883991 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:33.884369 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:34.383120 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:34.383204 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:34.383566 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:34.883846 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:34.883924 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:34.884224 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:35.383982 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:35.384056 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:35.384427 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:35.384483 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:35.884112 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:35.884192 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:35.884530 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:36.383425 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:36.383499 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:36.383766 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:36.883482 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:36.883565 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:36.883947 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:37.383742 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:37.383819 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:37.384158 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:37.883707 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:37.883774 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:37.884034 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:37.884074 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:38.383850 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:38.383958 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:38.384324 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:38.883075 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:38.883152 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:38.883501 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:39.383112 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:39.383186 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:39.383448 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:39.883223 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:39.883319 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:39.883638 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:40.383373 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:40.383445 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:40.383734 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:40.383787 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:40.883050 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:40.883126 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:40.883428 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:41.383217 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:41.383293 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:41.383634 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:41.883211 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:41.883294 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:41.883578 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:42.383069 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:42.383136 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:42.383390 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:42.883177 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:42.883268 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:42.883564 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:42.883610 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:43.383316 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:43.383402 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:43.383752 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:43.884076 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:43.884150 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:43.884466 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:44.383187 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:44.383282 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:44.383645 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:44.883389 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:44.883464 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:44.883804 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:44.883864 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:45.383117 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:45.383195 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:45.383502 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:45.883172 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:45.883255 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:45.883601 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:46.383362 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:46.383444 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:46.383798 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:46.883132 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:46.883204 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:46.883525 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:47.383245 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:47.383343 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:47.383724 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:47.383787 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:47.883322 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:47.883396 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:47.883705 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:48.383414 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:48.383490 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:48.383778 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:48.883192 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:48.883270 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:48.883613 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:49.383457 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:49.383533 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:49.383864 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:49.383922 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:49.883061 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:49.883134 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:49.883396 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:50.383133 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:50.383215 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:50.383592 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:50.883345 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:50.883424 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:50.883767 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:51.383609 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:51.383687 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:51.383946 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:51.383994 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:51.883714 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:51.883789 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:51.884128 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:52.383943 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:52.384028 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:52.384399 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:52.883710 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:52.883786 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:52.884049 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:53.383826 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:53.383902 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:53.384299 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:53.384353 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:53.883075 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:53.883154 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:53.883549 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:54.383241 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:54.383316 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:54.383579 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:54.883201 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:54.883281 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:54.883627 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:55.383208 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:55.383284 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:55.383613 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:55.883300 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:55.883372 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:55.883626 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:55.883666 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:56.383898 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:56.383987 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:56.384342 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:56.883076 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:56.883152 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:56.883529 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:57.383840 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:57.383919 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:57.384396 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:57.883127 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:57.883220 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:57.883601 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:58.383185 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:58.383263 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:58.383601 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:24:58.383658 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:24:58.883103 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:58.883174 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:58.883430 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:59.383161 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:59.383239 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:59.383528 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:24:59.883235 1701291 type.go:168] "Request Body" body=""
	I1124 09:24:59.883319 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:24:59.883655 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:00.383085 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:00.383175 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:00.383480 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:00.883287 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:00.883398 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:00.883768 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:25:00.883828 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:25:01.383727 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:01.383809 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:01.384138 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:01.883728 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:01.883797 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:01.884120 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:02.383919 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:02.383991 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:02.384291 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:02.884049 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:02.884120 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:02.884420 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:25:02.884485 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:25:03.383819 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:03.383888 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:03.384209 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:03.884004 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:03.884091 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:03.884451 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:04.384095 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:04.384179 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:04.384501 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:04.883234 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:04.883303 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:04.883584 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:05.383144 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:05.383220 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:05.383542 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:25:05.383601 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:25:05.883200 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:05.883285 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:05.883658 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:06.383258 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:06.383334 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:06.383660 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:06.883258 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:06.883336 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:06.883680 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:07.383404 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:07.383485 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:07.383858 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:25:07.383913 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:25:07.883619 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:07.883699 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:07.883964 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:08.383741 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:08.383818 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:08.384168 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:08.883989 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:08.884068 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:08.884393 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:09.384092 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:09.384163 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:09.384427 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:25:09.384467 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:25:09.883175 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:09.883251 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:09.883564 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:10.383182 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:10.383261 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:10.383593 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:10.883126 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:10.883201 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:10.883461 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:11.383179 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:11.383273 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:11.383596 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:11.883182 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:11.883257 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:11.883609 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:25:11.883665 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:25:12.383318 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:12.383399 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:12.383715 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:12.883177 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:12.883251 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:12.883592 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:13.383299 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:13.383377 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:13.383726 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:13.883393 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:13.883461 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:13.883721 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:25:13.883763 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:25:14.383180 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:14.383258 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:14.383605 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:14.883184 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:14.883272 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:14.883660 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:15.383351 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:15.383434 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:15.383700 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:15.883201 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:15.883305 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:15.883711 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:16.383360 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:16.383441 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:16.383809 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:25:16.383867 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:25:16.883068 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:16.883136 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:16.883406 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:17.383093 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:17.383175 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:17.383513 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:17.883239 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:17.883322 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:17.883695 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:18.383395 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:18.383464 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:18.383742 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:18.883187 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:18.883264 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:18.883645 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:25:18.883700 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:25:19.383197 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:19.383275 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:19.383625 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:19.883335 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:19.883406 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:19.883791 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:20.383200 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:20.383305 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:20.383704 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:20.883416 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:20.883493 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:20.883891 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:25:20.883947 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:25:21.383661 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:21.383731 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:21.383987 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:21.883737 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:21.883815 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:21.884385 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:22.383106 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:22.383198 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:22.383565 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:22.883149 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:22.883216 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:22.883512 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:23.383192 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:23.383276 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:23.383626 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:25:23.383680 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:25:23.883354 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:23.883454 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:23.883802 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:24.383131 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:24.383205 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:24.383521 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:24.883214 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:24.883298 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:24.883675 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:25.383244 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:25.383317 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:25.383636 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:25.883064 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:25.883143 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:25.883420 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1124 09:25:25.883474 1701291 node_ready.go:55] error getting node "functional-291288" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-291288": dial tcp 192.168.49.2:8441: connect: connection refused
	I1124 09:25:26.383164 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:26.383254 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:26.383617 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:26.883333 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:26.883410 1701291 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-291288" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1124 09:25:26.883740 1701291 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1124 09:25:27.383124 1701291 type.go:168] "Request Body" body=""
	I1124 09:25:27.383182 1701291 node_ready.go:38] duration metric: took 6m0.000242478s for node "functional-291288" to be "Ready" ...
	I1124 09:25:27.386338 1701291 out.go:203] 
	W1124 09:25:27.389204 1701291 out.go:285] X Exiting due to GUEST_START: failed to start node: wait 6m0s for node: waiting for node to be ready: WaitNodeCondition: context deadline exceeded
	W1124 09:25:27.389224 1701291 out.go:285] * 
	W1124 09:25:27.391374 1701291 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1124 09:25:27.394404 1701291 out.go:203] 
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> containerd <==
	Nov 24 09:25:34 functional-291288 containerd[5880]: time="2025-11-24T09:25:34.653424959Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.3\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Nov 24 09:25:35 functional-291288 containerd[5880]: time="2025-11-24T09:25:35.680892289Z" level=info msg="No images store for sha256:a1f83055284ec302ac691d8677946d8b4e772fb7071d39ada1cc9184cb70814b"
	Nov 24 09:25:35 functional-291288 containerd[5880]: time="2025-11-24T09:25:35.683044385Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:latest\""
	Nov 24 09:25:35 functional-291288 containerd[5880]: time="2025-11-24T09:25:35.689835871Z" level=info msg="ImageCreate event name:\"sha256:8cb2091f603e75187e2f6226c5901d12e00b1d1f778c6471ae4578e8a1c4724a\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Nov 24 09:25:35 functional-291288 containerd[5880]: time="2025-11-24T09:25:35.690177864Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:latest\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Nov 24 09:25:36 functional-291288 containerd[5880]: time="2025-11-24T09:25:36.659460561Z" level=info msg="No images store for sha256:f22c82f26787d6279eede4444a6bf746ad608345b849f71f03d60b8e0589531e"
	Nov 24 09:25:36 functional-291288 containerd[5880]: time="2025-11-24T09:25:36.661620279Z" level=info msg="ImageCreate event name:\"docker.io/library/minikube-local-cache-test:functional-291288\""
	Nov 24 09:25:36 functional-291288 containerd[5880]: time="2025-11-24T09:25:36.668726508Z" level=info msg="ImageCreate event name:\"sha256:cee23f3226e286bed41a2ff2b5fad8e6e395a2934880a19d2a902b07140a4221\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Nov 24 09:25:36 functional-291288 containerd[5880]: time="2025-11-24T09:25:36.669034770Z" level=info msg="ImageUpdate event name:\"docker.io/library/minikube-local-cache-test:functional-291288\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Nov 24 09:25:37 functional-291288 containerd[5880]: time="2025-11-24T09:25:37.473140767Z" level=info msg="RemoveImage \"registry.k8s.io/pause:latest\""
	Nov 24 09:25:37 functional-291288 containerd[5880]: time="2025-11-24T09:25:37.475635774Z" level=info msg="ImageDelete event name:\"registry.k8s.io/pause:latest\""
	Nov 24 09:25:37 functional-291288 containerd[5880]: time="2025-11-24T09:25:37.477631216Z" level=info msg="ImageDelete event name:\"sha256:8cb2091f603e75187e2f6226c5901d12e00b1d1f778c6471ae4578e8a1c4724a\""
	Nov 24 09:25:37 functional-291288 containerd[5880]: time="2025-11-24T09:25:37.491716240Z" level=info msg="RemoveImage \"registry.k8s.io/pause:latest\" returns successfully"
	Nov 24 09:25:38 functional-291288 containerd[5880]: time="2025-11-24T09:25:38.419124966Z" level=info msg="RemoveImage \"registry.k8s.io/pause:3.1\""
	Nov 24 09:25:38 functional-291288 containerd[5880]: time="2025-11-24T09:25:38.421484178Z" level=info msg="ImageDelete event name:\"registry.k8s.io/pause:3.1\""
	Nov 24 09:25:38 functional-291288 containerd[5880]: time="2025-11-24T09:25:38.423519611Z" level=info msg="ImageDelete event name:\"sha256:8057e0500773a37cde2cff041eb13ebd68c748419a2fbfd1dfb5bf38696cc8e5\""
	Nov 24 09:25:38 functional-291288 containerd[5880]: time="2025-11-24T09:25:38.431745023Z" level=info msg="RemoveImage \"registry.k8s.io/pause:3.1\" returns successfully"
	Nov 24 09:25:38 functional-291288 containerd[5880]: time="2025-11-24T09:25:38.591084144Z" level=info msg="No images store for sha256:a1f83055284ec302ac691d8677946d8b4e772fb7071d39ada1cc9184cb70814b"
	Nov 24 09:25:38 functional-291288 containerd[5880]: time="2025-11-24T09:25:38.593237060Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:latest\""
	Nov 24 09:25:38 functional-291288 containerd[5880]: time="2025-11-24T09:25:38.600078138Z" level=info msg="ImageCreate event name:\"sha256:8cb2091f603e75187e2f6226c5901d12e00b1d1f778c6471ae4578e8a1c4724a\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Nov 24 09:25:38 functional-291288 containerd[5880]: time="2025-11-24T09:25:38.601791057Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:latest\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Nov 24 09:25:38 functional-291288 containerd[5880]: time="2025-11-24T09:25:38.715557551Z" level=info msg="No images store for sha256:3ac89611d5efd8eb74174b1f04c33b7e73b651cec35b5498caf0cfdd2efd7d48"
	Nov 24 09:25:38 functional-291288 containerd[5880]: time="2025-11-24T09:25:38.717850948Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.1\""
	Nov 24 09:25:38 functional-291288 containerd[5880]: time="2025-11-24T09:25:38.724559939Z" level=info msg="ImageCreate event name:\"sha256:8057e0500773a37cde2cff041eb13ebd68c748419a2fbfd1dfb5bf38696cc8e5\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Nov 24 09:25:38 functional-291288 containerd[5880]: time="2025-11-24T09:25:38.724882822Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.1\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:25:42.834962    9983 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:25:42.835314    9983 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:25:42.836928    9983 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:25:42.837816    9983 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:25:42.839736    9983 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[Nov24 08:20] overlayfs: idmapped layers are currently not supported
	[Nov24 08:21] overlayfs: idmapped layers are currently not supported
	[Nov24 08:22] overlayfs: idmapped layers are currently not supported
	[Nov24 08:23] overlayfs: idmapped layers are currently not supported
	[Nov24 08:24] overlayfs: idmapped layers are currently not supported
	[ +19.566509] overlayfs: idmapped layers are currently not supported
	[Nov24 08:25] overlayfs: idmapped layers are currently not supported
	[ +35.934095] overlayfs: idmapped layers are currently not supported
	[Nov24 08:27] overlayfs: idmapped layers are currently not supported
	[Nov24 08:28] overlayfs: idmapped layers are currently not supported
	[Nov24 08:29] overlayfs: idmapped layers are currently not supported
	[Nov24 08:30] overlayfs: idmapped layers are currently not supported
	[Nov24 08:31] overlayfs: idmapped layers are currently not supported
	[ +44.505180] overlayfs: idmapped layers are currently not supported
	[Nov24 08:32] overlayfs: idmapped layers are currently not supported
	[Nov24 08:33] overlayfs: idmapped layers are currently not supported
	[Nov24 08:34] overlayfs: idmapped layers are currently not supported
	[ +21.137877] overlayfs: idmapped layers are currently not supported
	[ +28.724175] overlayfs: idmapped layers are currently not supported
	[Nov24 08:36] overlayfs: idmapped layers are currently not supported
	[Nov24 08:37] overlayfs: idmapped layers are currently not supported
	[Nov24 08:38] overlayfs: idmapped layers are currently not supported
	[  +4.063834] overlayfs: idmapped layers are currently not supported
	[Nov24 08:40] overlayfs: idmapped layers are currently not supported
	[Nov24 08:42] kauditd_printk_skb: 8 callbacks suppressed
	
	
	==> kernel <==
	 09:25:42 up  8:07,  0 user,  load average: 0.56, 0.31, 0.50
	Linux functional-291288 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Nov 24 09:25:39 functional-291288 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Nov 24 09:25:40 functional-291288 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 827.
	Nov 24 09:25:40 functional-291288 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:25:40 functional-291288 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:25:40 functional-291288 kubelet[9767]: E1124 09:25:40.191081    9767 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Nov 24 09:25:40 functional-291288 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Nov 24 09:25:40 functional-291288 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Nov 24 09:25:40 functional-291288 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 828.
	Nov 24 09:25:40 functional-291288 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:25:40 functional-291288 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:25:40 functional-291288 kubelet[9858]: E1124 09:25:40.955548    9858 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Nov 24 09:25:40 functional-291288 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Nov 24 09:25:40 functional-291288 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Nov 24 09:25:41 functional-291288 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 829.
	Nov 24 09:25:41 functional-291288 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:25:41 functional-291288 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:25:41 functional-291288 kubelet[9878]: E1124 09:25:41.694893    9878 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Nov 24 09:25:41 functional-291288 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Nov 24 09:25:41 functional-291288 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Nov 24 09:25:42 functional-291288 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 830.
	Nov 24 09:25:42 functional-291288 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:25:42 functional-291288 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:25:42 functional-291288 kubelet[9899]: E1124 09:25:42.441792    9899 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Nov 24 09:25:42 functional-291288 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Nov 24 09:25:42 functional-291288 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	

-- /stdout --
helpers_test.go:262: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-291288 -n functional-291288
helpers_test.go:262: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-291288 -n functional-291288: exit status 2 (377.044391ms)

-- stdout --
	Stopped

-- /stdout --
helpers_test.go:262: status error: exit status 2 (may be ok)
helpers_test.go:264: "functional-291288" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/MinikubeKubectlCmdDirectly (2.43s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/ExtraConfig (737.68s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/ExtraConfig
functional_test.go:772: (dbg) Run:  out/minikube-linux-arm64 start -p functional-291288 --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision --wait=all
E1124 09:26:03.604295 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/addons-674149/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1124 09:26:24.717397 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-941011/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1124 09:27:47.784657 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-941011/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1124 09:29:06.675065 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/addons-674149/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1124 09:31:03.605372 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/addons-674149/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1124 09:31:24.717363 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-941011/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1124 09:36:03.604712 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/addons-674149/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1124 09:36:24.717283 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-941011/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
functional_test.go:772: (dbg) Non-zero exit: out/minikube-linux-arm64 start -p functional-291288 --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision --wait=all: exit status 109 (12m15.281946464s)

-- stdout --
	* [functional-291288] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	  - MINIKUBE_LOCATION=21978
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/21978-1652607/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/21978-1652607/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-arm64
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the docker driver based on existing profile
	* Starting "functional-291288" primary control-plane node in "functional-291288" cluster
	* Pulling base image v0.0.48-1763789673-21948 ...
	* Preparing Kubernetes v1.35.0-beta.0 on containerd 2.1.5 ...
	  - apiserver.enable-admission-plugins=NamespaceAutoProvision
	
	

-- /stdout --
** stderr ** 
	! Unable to restart control-plane node(s), will reset cluster: <no value>
	! initialization failed, will try again: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000251849s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	* 
	X Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001230859s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	* 
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	X Exiting due to K8S_KUBELET_NOT_RUNNING: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001230859s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	* Suggestion: Check output of 'journalctl -xeu kubelet', try passing --extra-config=kubelet.cgroup-driver=systemd to minikube start
	* Related issue: https://github.com/kubernetes/minikube/issues/4172

** /stderr **
functional_test.go:774: failed to restart minikube. args "out/minikube-linux-arm64 start -p functional-291288 --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision --wait=all": exit status 109
functional_test.go:776: restart took 12m15.283169106s for "functional-291288" cluster.
I1124 09:37:59.173906 1654467 config.go:182] Loaded profile config "functional-291288": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:223: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/ExtraConfig]: network settings <======
helpers_test.go:230: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:238: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/ExtraConfig]: docker inspect <======
helpers_test.go:239: (dbg) Run:  docker inspect functional-291288
helpers_test.go:243: (dbg) docker inspect functional-291288:

-- stdout --
	[
	    {
	        "Id": "70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52",
	        "Created": "2025-11-24T09:10:51.896020191Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 1695240,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-11-24T09:10:51.968983407Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:572c983e466f1f784136812eef5cc59ac623db764bc7704d3676c4643993fd08",
	        "ResolvConfPath": "/var/lib/docker/containers/70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52/hostname",
	        "HostsPath": "/var/lib/docker/containers/70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52/hosts",
	        "LogPath": "/var/lib/docker/containers/70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52/70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52-json.log",
	        "Name": "/functional-291288",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "functional-291288:/var",
	                "/lib/modules:/lib/modules:ro"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-291288",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52",
	                "LowerDir": "/var/lib/docker/overlay2/3cc6f5f8c809d502c515552ad283ef0dee330beb830a15376b0447c77fbc81b7-init/diff:/var/lib/docker/overlay2/bf294786c7ade59cb761c7bdcc27045362c3779b00a569b685443f34ade17737/diff",
	                "MergedDir": "/var/lib/docker/overlay2/3cc6f5f8c809d502c515552ad283ef0dee330beb830a15376b0447c77fbc81b7/merged",
	                "UpperDir": "/var/lib/docker/overlay2/3cc6f5f8c809d502c515552ad283ef0dee330beb830a15376b0447c77fbc81b7/diff",
	                "WorkDir": "/var/lib/docker/overlay2/3cc6f5f8c809d502c515552ad283ef0dee330beb830a15376b0447c77fbc81b7/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "volume",
	                "Name": "functional-291288",
	                "Source": "/var/lib/docker/volumes/functional-291288/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            },
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-291288",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-291288",
	                "name.minikube.sigs.k8s.io": "functional-291288",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "09c1c2eef0dca6362dde63b4cbc372c0cfa3e4fd084b8745043d8b88925691bf",
	            "SandboxKey": "/var/run/docker/netns/09c1c2eef0dc",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34684"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34685"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34688"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34686"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34687"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-291288": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "7e:49:22:0b:f9:2c",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "1e8f91e8ad9f46b831bbb1b0589b0022d940ee9875e64a648dc80612f3ca93dc",
	                    "EndpointID": "5de5ca8ccb07584b21e6e4e30dba12e0233e8d28c3e48e705cddffe75263b337",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-291288",
	                        "70848be15fcc"
	                    ]
	                }
	            }
	        }
	    }
	]

-- /stdout --
helpers_test.go:247: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-291288 -n functional-291288
helpers_test.go:247: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-291288 -n functional-291288: exit status 2 (310.779177ms)

-- stdout --
	Running

-- /stdout --
helpers_test.go:247: status error: exit status 2 (may be ok)
helpers_test.go:252: <<< TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/ExtraConfig FAILED: start of post-mortem logs <<<
helpers_test.go:253: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/ExtraConfig]: minikube logs <======
helpers_test.go:255: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 logs -n 25
helpers_test.go:255: (dbg) Done: out/minikube-linux-arm64 -p functional-291288 logs -n 25: (1.217278944s)
helpers_test.go:260: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/ExtraConfig logs: 
-- stdout --
	
	==> Audit <==
	┌─────────┬─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│ COMMAND │                                                                          ARGS                                                                           │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├─────────┼─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ image   │ functional-941011 image ls --format yaml --alsologtostderr                                                                                              │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:07 UTC │ 24 Nov 25 09:07 UTC │
	│ ssh     │ functional-941011 ssh pgrep buildkitd                                                                                                                   │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:07 UTC │                     │
	│ image   │ functional-941011 image build -t localhost/my-image:functional-941011 testdata/build --alsologtostderr                                                  │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:07 UTC │ 24 Nov 25 09:07 UTC │
	│ image   │ functional-941011 image ls                                                                                                                              │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:07 UTC │ 24 Nov 25 09:07 UTC │
	│ image   │ functional-941011 image ls --format json --alsologtostderr                                                                                              │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:07 UTC │ 24 Nov 25 09:07 UTC │
	│ image   │ functional-941011 image ls --format table --alsologtostderr                                                                                             │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:07 UTC │ 24 Nov 25 09:07 UTC │
	│ delete  │ -p functional-941011                                                                                                                                    │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:10 UTC │ 24 Nov 25 09:10 UTC │
	│ start   │ -p functional-291288 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-beta.0 │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:10 UTC │                     │
	│ start   │ -p functional-291288 --alsologtostderr -v=8                                                                                                             │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:19 UTC │                     │
	│ cache   │ functional-291288 cache add registry.k8s.io/pause:3.1                                                                                                   │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:25 UTC │ 24 Nov 25 09:25 UTC │
	│ cache   │ functional-291288 cache add registry.k8s.io/pause:3.3                                                                                                   │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:25 UTC │ 24 Nov 25 09:25 UTC │
	│ cache   │ functional-291288 cache add registry.k8s.io/pause:latest                                                                                                │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:25 UTC │ 24 Nov 25 09:25 UTC │
	│ cache   │ functional-291288 cache add minikube-local-cache-test:functional-291288                                                                                 │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:25 UTC │ 24 Nov 25 09:25 UTC │
	│ cache   │ functional-291288 cache delete minikube-local-cache-test:functional-291288                                                                              │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:25 UTC │ 24 Nov 25 09:25 UTC │
	│ cache   │ delete registry.k8s.io/pause:3.3                                                                                                                        │ minikube          │ jenkins │ v1.37.0 │ 24 Nov 25 09:25 UTC │ 24 Nov 25 09:25 UTC │
	│ cache   │ list                                                                                                                                                    │ minikube          │ jenkins │ v1.37.0 │ 24 Nov 25 09:25 UTC │ 24 Nov 25 09:25 UTC │
	│ ssh     │ functional-291288 ssh sudo crictl images                                                                                                                │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:25 UTC │ 24 Nov 25 09:25 UTC │
	│ ssh     │ functional-291288 ssh sudo crictl rmi registry.k8s.io/pause:latest                                                                                      │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:25 UTC │ 24 Nov 25 09:25 UTC │
	│ ssh     │ functional-291288 ssh sudo crictl inspecti registry.k8s.io/pause:latest                                                                                 │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:25 UTC │                     │
	│ cache   │ functional-291288 cache reload                                                                                                                          │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:25 UTC │ 24 Nov 25 09:25 UTC │
	│ ssh     │ functional-291288 ssh sudo crictl inspecti registry.k8s.io/pause:latest                                                                                 │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:25 UTC │ 24 Nov 25 09:25 UTC │
	│ cache   │ delete registry.k8s.io/pause:3.1                                                                                                                        │ minikube          │ jenkins │ v1.37.0 │ 24 Nov 25 09:25 UTC │ 24 Nov 25 09:25 UTC │
	│ cache   │ delete registry.k8s.io/pause:latest                                                                                                                     │ minikube          │ jenkins │ v1.37.0 │ 24 Nov 25 09:25 UTC │ 24 Nov 25 09:25 UTC │
	│ kubectl │ functional-291288 kubectl -- --context functional-291288 get pods                                                                                       │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:25 UTC │                     │
	│ start   │ -p functional-291288 --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision --wait=all                                                │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:25 UTC │                     │
	└─────────┴─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/11/24 09:25:43
	Running on machine: ip-172-31-30-239
	Binary: Built with gc go1.25.3 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1124 09:25:43.956868 1707070 out.go:360] Setting OutFile to fd 1 ...
	I1124 09:25:43.957002 1707070 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1124 09:25:43.957006 1707070 out.go:374] Setting ErrFile to fd 2...
	I1124 09:25:43.957010 1707070 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1124 09:25:43.957247 1707070 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21978-1652607/.minikube/bin
	I1124 09:25:43.957575 1707070 out.go:368] Setting JSON to false
	I1124 09:25:43.958421 1707070 start.go:133] hostinfo: {"hostname":"ip-172-31-30-239","uptime":29273,"bootTime":1763947071,"procs":157,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"92f46a7d-c249-4c12-924a-77f64874c910"}
	I1124 09:25:43.958501 1707070 start.go:143] virtualization:  
	I1124 09:25:43.961954 1707070 out.go:179] * [functional-291288] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1124 09:25:43.965745 1707070 out.go:179]   - MINIKUBE_LOCATION=21978
	I1124 09:25:43.965806 1707070 notify.go:221] Checking for updates...
	I1124 09:25:43.971831 1707070 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1124 09:25:43.974596 1707070 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/21978-1652607/kubeconfig
	I1124 09:25:43.977531 1707070 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/21978-1652607/.minikube
	I1124 09:25:43.980447 1707070 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1124 09:25:43.983266 1707070 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1124 09:25:43.986897 1707070 config.go:182] Loaded profile config "functional-291288": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1124 09:25:43.986999 1707070 driver.go:422] Setting default libvirt URI to qemu:///system
	I1124 09:25:44.009686 1707070 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1124 09:25:44.009789 1707070 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1124 09:25:44.075505 1707070 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:55 SystemTime:2025-11-24 09:25:44.065719192 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214831104 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1124 09:25:44.075607 1707070 docker.go:319] overlay module found
	I1124 09:25:44.080493 1707070 out.go:179] * Using the docker driver based on existing profile
	I1124 09:25:44.083298 1707070 start.go:309] selected driver: docker
	I1124 09:25:44.083323 1707070 start.go:927] validating driver "docker" against &{Name:functional-291288 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-291288 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1124 09:25:44.083409 1707070 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1124 09:25:44.083513 1707070 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1124 09:25:44.137525 1707070 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:55 SystemTime:2025-11-24 09:25:44.127840235 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214831104 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1124 09:25:44.137959 1707070 start_flags.go:992] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I1124 09:25:44.137984 1707070 cni.go:84] Creating CNI manager for ""
	I1124 09:25:44.138040 1707070 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1124 09:25:44.138097 1707070 start.go:353] cluster config:
	{Name:functional-291288 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-291288 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1124 09:25:44.143064 1707070 out.go:179] * Starting "functional-291288" primary control-plane node in "functional-291288" cluster
	I1124 09:25:44.145761 1707070 cache.go:134] Beginning downloading kic base image for docker with containerd
	I1124 09:25:44.148578 1707070 out.go:179] * Pulling base image v0.0.48-1763789673-21948 ...
	I1124 09:25:44.151418 1707070 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
	I1124 09:25:44.151496 1707070 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f in local docker daemon
	I1124 09:25:44.171581 1707070 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f in local docker daemon, skipping pull
	I1124 09:25:44.171593 1707070 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f exists in daemon, skipping load
	W1124 09:25:44.210575 1707070 preload.go:144] https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.35.0-beta.0/preloaded-images-k8s-v18-v1.35.0-beta.0-containerd-overlay2-arm64.tar.lz4 status code: 404
	W1124 09:25:44.425167 1707070 preload.go:144] https://github.com/kubernetes-sigs/minikube-preloads/releases/download/v18/preloaded-images-k8s-v18-v1.35.0-beta.0-containerd-overlay2-arm64.tar.lz4 status code: 404
	I1124 09:25:44.425335 1707070 profile.go:143] Saving config to /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/config.json ...
	I1124 09:25:44.425459 1707070 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:25:44.425602 1707070 cache.go:243] Successfully downloaded all kic artifacts
	I1124 09:25:44.425631 1707070 start.go:360] acquireMachinesLock for functional-291288: {Name:mk85384dc057570e1f34db593d357cea738652c4 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:25:44.425681 1707070 start.go:364] duration metric: took 28.381µs to acquireMachinesLock for "functional-291288"
	I1124 09:25:44.425694 1707070 start.go:96] Skipping create...Using existing machine configuration
	I1124 09:25:44.425698 1707070 fix.go:54] fixHost starting: 
	I1124 09:25:44.425962 1707070 cli_runner.go:164] Run: docker container inspect functional-291288 --format={{.State.Status}}
	I1124 09:25:44.443478 1707070 fix.go:112] recreateIfNeeded on functional-291288: state=Running err=<nil>
	W1124 09:25:44.443512 1707070 fix.go:138] unexpected machine state, will restart: <nil>
	I1124 09:25:44.447296 1707070 out.go:252] * Updating the running docker "functional-291288" container ...
	I1124 09:25:44.447326 1707070 machine.go:94] provisionDockerMachine start ...
	I1124 09:25:44.447405 1707070 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:25:44.465953 1707070 main.go:143] libmachine: Using SSH client type: native
	I1124 09:25:44.466284 1707070 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 34684 <nil> <nil>}
	I1124 09:25:44.466291 1707070 main.go:143] libmachine: About to run SSH command:
	hostname
	I1124 09:25:44.603673 1707070 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:25:44.618572 1707070 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-291288
	
	I1124 09:25:44.618586 1707070 ubuntu.go:182] provisioning hostname "functional-291288"
	I1124 09:25:44.618668 1707070 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:25:44.659382 1707070 main.go:143] libmachine: Using SSH client type: native
	I1124 09:25:44.659732 1707070 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 34684 <nil> <nil>}
	I1124 09:25:44.659741 1707070 main.go:143] libmachine: About to run SSH command:
	sudo hostname functional-291288 && echo "functional-291288" | sudo tee /etc/hostname
	I1124 09:25:44.806505 1707070 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:25:44.844189 1707070 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-291288
	
	I1124 09:25:44.844281 1707070 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:25:44.868659 1707070 main.go:143] libmachine: Using SSH client type: native
	I1124 09:25:44.869019 1707070 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 34684 <nil> <nil>}
	I1124 09:25:44.869041 1707070 main.go:143] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sfunctional-291288' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 functional-291288/g' /etc/hosts;
				else 
					echo '127.0.1.1 functional-291288' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1124 09:25:44.979106 1707070 cache.go:107] acquiring lock: {Name:mk22a10f0ce1f3295b61e7e76c455d0494a3e278 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:25:44.979193 1707070 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 exists
	I1124 09:25:44.979201 1707070 cache.go:96] cache image "gcr.io/k8s-minikube/storage-provisioner:v5" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5" took 127.862µs
	I1124 09:25:44.979207 1707070 cache.go:80] save to tar file gcr.io/k8s-minikube/storage-provisioner:v5 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 succeeded
	I1124 09:25:44.979198 1707070 cache.go:107] acquiring lock: {Name:mk80fdbe7cdb5bc17c2a82b4ecfd00214559a435 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:25:44.979218 1707070 cache.go:107] acquiring lock: {Name:mk85f1502dbb97830776608fb729eb3605e112e6 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:25:44.979237 1707070 cache.go:107] acquiring lock: {Name:mk46ce3b59d7e062b3dbc8a90fe5b4231f256471 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:25:44.979267 1707070 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 exists
	I1124 09:25:44.979266 1707070 cache.go:107] acquiring lock: {Name:mk1cf42e67442503a46c578224bd3cb68bf682d4 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:25:44.979273 1707070 cache.go:96] cache image "registry.k8s.io/pause:3.10.1" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1" took 55.992µs
	I1124 09:25:44.979277 1707070 cache.go:80] save to tar file registry.k8s.io/pause:3.10.1 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 succeeded
	I1124 09:25:44.979285 1707070 cache.go:107] acquiring lock: {Name:mk726502cb84c177b2e14fee88512325761511c5 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:25:44.979301 1707070 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 exists
	I1124 09:25:44.979310 1707070 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 exists
	I1124 09:25:44.979308 1707070 cache.go:96] cache image "registry.k8s.io/kube-apiserver:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0" took 43.274µs
	I1124 09:25:44.979314 1707070 cache.go:96] cache image "registry.k8s.io/coredns/coredns:v1.13.1" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1" took 29.982µs
	I1124 09:25:44.979319 1707070 cache.go:80] save to tar file registry.k8s.io/coredns/coredns:v1.13.1 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 succeeded
	I1124 09:25:44.979319 1707070 cache.go:80] save to tar file registry.k8s.io/kube-apiserver:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 succeeded
	I1124 09:25:44.979326 1707070 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.5.24-0 exists
	I1124 09:25:44.979330 1707070 cache.go:96] cache image "registry.k8s.io/etcd:3.5.24-0" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.5.24-0" took 94.392µs
	I1124 09:25:44.979336 1707070 cache.go:80] save to tar file registry.k8s.io/etcd:3.5.24-0 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.5.24-0 succeeded
	I1124 09:25:44.979330 1707070 cache.go:107] acquiring lock: {Name:mkfdc49c8e68aee34cee0c9d441ae8a4dca675c6 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:25:44.979345 1707070 cache.go:107] acquiring lock: {Name:mkdbf38e05e2c47c1a7a906a2236e9e7020a94c5 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:25:44.979364 1707070 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 exists
	I1124 09:25:44.979370 1707070 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 exists
	I1124 09:25:44.979368 1707070 cache.go:96] cache image "registry.k8s.io/kube-controller-manager:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0" took 40.427µs
	I1124 09:25:44.979373 1707070 cache.go:96] cache image "registry.k8s.io/kube-scheduler:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0" took 29.49µs
	I1124 09:25:44.979375 1707070 cache.go:80] save to tar file registry.k8s.io/kube-controller-manager:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 succeeded
	I1124 09:25:44.979378 1707070 cache.go:80] save to tar file registry.k8s.io/kube-scheduler:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 succeeded
	I1124 09:25:44.979407 1707070 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 exists
	I1124 09:25:44.979413 1707070 cache.go:96] cache image "registry.k8s.io/kube-proxy:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0" took 225.709µs
	I1124 09:25:44.979418 1707070 cache.go:80] save to tar file registry.k8s.io/kube-proxy:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 succeeded
	I1124 09:25:44.979424 1707070 cache.go:87] Successfully saved all images to host disk.
	I1124 09:25:45.028668 1707070 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	I1124 09:25:45.028686 1707070 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/21978-1652607/.minikube CaCertPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/21978-1652607/.minikube}
	I1124 09:25:45.028706 1707070 ubuntu.go:190] setting up certificates
	I1124 09:25:45.028727 1707070 provision.go:84] configureAuth start
	I1124 09:25:45.028800 1707070 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-291288
	I1124 09:25:45.083635 1707070 provision.go:143] copyHostCerts
	I1124 09:25:45.083709 1707070 exec_runner.go:144] found /home/jenkins/minikube-integration/21978-1652607/.minikube/key.pem, removing ...
	I1124 09:25:45.083718 1707070 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21978-1652607/.minikube/key.pem
	I1124 09:25:45.083806 1707070 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/21978-1652607/.minikube/key.pem (1679 bytes)
	I1124 09:25:45.083920 1707070 exec_runner.go:144] found /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.pem, removing ...
	I1124 09:25:45.083924 1707070 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.pem
	I1124 09:25:45.083951 1707070 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.pem (1078 bytes)
	I1124 09:25:45.084006 1707070 exec_runner.go:144] found /home/jenkins/minikube-integration/21978-1652607/.minikube/cert.pem, removing ...
	I1124 09:25:45.084009 1707070 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21978-1652607/.minikube/cert.pem
	I1124 09:25:45.084038 1707070 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/21978-1652607/.minikube/cert.pem (1123 bytes)
	I1124 09:25:45.084083 1707070 provision.go:117] generating server cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca-key.pem org=jenkins.functional-291288 san=[127.0.0.1 192.168.49.2 functional-291288 localhost minikube]
	I1124 09:25:45.498574 1707070 provision.go:177] copyRemoteCerts
	I1124 09:25:45.498637 1707070 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1124 09:25:45.498677 1707070 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:25:45.520187 1707070 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34684 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-291288/id_rsa Username:docker}
	I1124 09:25:45.626724 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I1124 09:25:45.644660 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server.pem --> /etc/docker/server.pem (1220 bytes)
	I1124 09:25:45.663269 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I1124 09:25:45.681392 1707070 provision.go:87] duration metric: took 652.643227ms to configureAuth
	I1124 09:25:45.681410 1707070 ubuntu.go:206] setting minikube options for container-runtime
	I1124 09:25:45.681611 1707070 config.go:182] Loaded profile config "functional-291288": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1124 09:25:45.681617 1707070 machine.go:97] duration metric: took 1.234286229s to provisionDockerMachine
	I1124 09:25:45.681624 1707070 start.go:293] postStartSetup for "functional-291288" (driver="docker")
	I1124 09:25:45.681634 1707070 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1124 09:25:45.681687 1707070 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1124 09:25:45.681727 1707070 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:25:45.698790 1707070 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34684 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-291288/id_rsa Username:docker}
	I1124 09:25:45.802503 1707070 ssh_runner.go:195] Run: cat /etc/os-release
	I1124 09:25:45.805922 1707070 main.go:143] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1124 09:25:45.805944 1707070 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1124 09:25:45.805954 1707070 filesync.go:126] Scanning /home/jenkins/minikube-integration/21978-1652607/.minikube/addons for local assets ...
	I1124 09:25:45.806011 1707070 filesync.go:126] Scanning /home/jenkins/minikube-integration/21978-1652607/.minikube/files for local assets ...
	I1124 09:25:45.806087 1707070 filesync.go:149] local asset: /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/ssl/certs/16544672.pem -> 16544672.pem in /etc/ssl/certs
	I1124 09:25:45.806167 1707070 filesync.go:149] local asset: /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/test/nested/copy/1654467/hosts -> hosts in /etc/test/nested/copy/1654467
	I1124 09:25:45.806257 1707070 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs /etc/test/nested/copy/1654467
	I1124 09:25:45.814093 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/ssl/certs/16544672.pem --> /etc/ssl/certs/16544672.pem (1708 bytes)
	I1124 09:25:45.832308 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/test/nested/copy/1654467/hosts --> /etc/test/nested/copy/1654467/hosts (40 bytes)
	I1124 09:25:45.850625 1707070 start.go:296] duration metric: took 168.9873ms for postStartSetup
	I1124 09:25:45.850696 1707070 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1124 09:25:45.850734 1707070 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:25:45.868479 1707070 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34684 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-291288/id_rsa Username:docker}
	I1124 09:25:45.971382 1707070 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1124 09:25:45.976655 1707070 fix.go:56] duration metric: took 1.550948262s for fixHost
	I1124 09:25:45.976671 1707070 start.go:83] releasing machines lock for "functional-291288", held for 1.550982815s
	I1124 09:25:45.976739 1707070 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-291288
	I1124 09:25:45.997505 1707070 ssh_runner.go:195] Run: cat /version.json
	I1124 09:25:45.997527 1707070 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1124 09:25:45.997550 1707070 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:25:45.997588 1707070 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:25:46.017321 1707070 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34684 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-291288/id_rsa Username:docker}
	I1124 09:25:46.018732 1707070 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34684 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-291288/id_rsa Username:docker}
	I1124 09:25:46.118131 1707070 ssh_runner.go:195] Run: systemctl --version
	I1124 09:25:46.213854 1707070 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W1124 09:25:46.218087 1707070 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1124 09:25:46.218149 1707070 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1124 09:25:46.225944 1707070 cni.go:259] no active bridge cni configs found in "/etc/cni/net.d" - nothing to disable
	I1124 09:25:46.225958 1707070 start.go:496] detecting cgroup driver to use...
	I1124 09:25:46.225989 1707070 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1124 09:25:46.226035 1707070 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I1124 09:25:46.241323 1707070 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I1124 09:25:46.254720 1707070 docker.go:218] disabling cri-docker service (if available) ...
	I1124 09:25:46.254789 1707070 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1124 09:25:46.270340 1707070 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1124 09:25:46.283549 1707070 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1124 09:25:46.399926 1707070 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1124 09:25:46.515234 1707070 docker.go:234] disabling docker service ...
	I1124 09:25:46.515290 1707070 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1124 09:25:46.529899 1707070 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1124 09:25:46.543047 1707070 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1124 09:25:46.658532 1707070 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1124 09:25:46.775880 1707070 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1124 09:25:46.790551 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I1124 09:25:46.806411 1707070 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:25:46.967053 1707070 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10.1"|' /etc/containerd/config.toml"
	I1124 09:25:46.977583 1707070 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I1124 09:25:46.986552 1707070 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I1124 09:25:46.986618 1707070 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I1124 09:25:46.995635 1707070 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1124 09:25:47.005680 1707070 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I1124 09:25:47.015425 1707070 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1124 09:25:47.024808 1707070 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1124 09:25:47.033022 1707070 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I1124 09:25:47.041980 1707070 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I1124 09:25:47.051362 1707070 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I1124 09:25:47.060469 1707070 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1124 09:25:47.068004 1707070 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1124 09:25:47.075326 1707070 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1124 09:25:47.191217 1707070 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I1124 09:25:47.313892 1707070 start.go:543] Will wait 60s for socket path /run/containerd/containerd.sock
	I1124 09:25:47.313955 1707070 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I1124 09:25:47.318001 1707070 start.go:564] Will wait 60s for crictl version
	I1124 09:25:47.318060 1707070 ssh_runner.go:195] Run: which crictl
	I1124 09:25:47.321766 1707070 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1124 09:25:47.347974 1707070 start.go:580] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v2.1.5
	RuntimeApiVersion:  v1
	I1124 09:25:47.348042 1707070 ssh_runner.go:195] Run: containerd --version
	I1124 09:25:47.369074 1707070 ssh_runner.go:195] Run: containerd --version
	I1124 09:25:47.394675 1707070 out.go:179] * Preparing Kubernetes v1.35.0-beta.0 on containerd 2.1.5 ...
	I1124 09:25:47.397593 1707070 cli_runner.go:164] Run: docker network inspect functional-291288 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1124 09:25:47.412872 1707070 ssh_runner.go:195] Run: grep 192.168.49.1	host.minikube.internal$ /etc/hosts
	I1124 09:25:47.419437 1707070 out.go:179]   - apiserver.enable-admission-plugins=NamespaceAutoProvision
	I1124 09:25:47.422135 1707070 kubeadm.go:884] updating cluster {Name:functional-291288 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-291288 Namespace:default APIServerHAVIP: APIServerName:minikub
eCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker Bina
ryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1124 09:25:47.422352 1707070 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:25:47.578507 1707070 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:25:47.745390 1707070 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:25:47.894887 1707070 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
	I1124 09:25:47.894982 1707070 ssh_runner.go:195] Run: sudo crictl images --output json
	I1124 09:25:47.919585 1707070 containerd.go:627] all images are preloaded for containerd runtime.
	I1124 09:25:47.919604 1707070 cache_images.go:86] Images are preloaded, skipping loading
	I1124 09:25:47.919612 1707070 kubeadm.go:935] updating node { 192.168.49.2 8441 v1.35.0-beta.0 containerd true true} ...
	I1124 09:25:47.919707 1707070 kubeadm.go:947] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.35.0-beta.0/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=functional-291288 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.49.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-291288 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I1124 09:25:47.919778 1707070 ssh_runner.go:195] Run: sudo crictl info
	I1124 09:25:47.948265 1707070 extraconfig.go:125] Overwriting default enable-admission-plugins=NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota with user provided enable-admission-plugins=NamespaceAutoProvision for component apiserver
	I1124 09:25:47.948285 1707070 cni.go:84] Creating CNI manager for ""
	I1124 09:25:47.948293 1707070 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1124 09:25:47.948308 1707070 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1124 09:25:47.948331 1707070 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.49.2 APIServerPort:8441 KubernetesVersion:v1.35.0-beta.0 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:functional-291288 NodeName:functional-291288 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceAutoProvision] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.49.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.49.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1124 09:25:47.948441 1707070 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.49.2
	  bindPort: 8441
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "functional-291288"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.49.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceAutoProvision"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8441
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.35.0-beta.0
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
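	The kubeadm config emitted above is a multi-document YAML stream (InitConfiguration, ClusterConfiguration, KubeletConfiguration, and KubeProxyConfiguration, separated by `---`). A quick structural sanity check is to list the `kind` of each document; the sketch below runs against a trimmed inline sample rather than the real `/var/tmp/minikube/kubeadm.yaml` referenced in the log:

```shell
# List the "kind" of every document in a multi-document kubeadm config.
# Inline sample here; in the log the file is /var/tmp/minikube/kubeadm.yaml.
cfg=$(mktemp)
cat > "$cfg" <<'EOF'
apiVersion: kubeadm.k8s.io/v1beta4
kind: InitConfiguration
---
apiVersion: kubeadm.k8s.io/v1beta4
kind: ClusterConfiguration
---
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
EOF
# Split on "key: value" and print the value wherever the key is "kind".
awk -F': *' '$1 == "kind" { print $2 }' "$cfg"
```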
	I1124 09:25:47.948507 1707070 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.35.0-beta.0
	I1124 09:25:47.956183 1707070 binaries.go:51] Found k8s binaries, skipping transfer
	I1124 09:25:47.956246 1707070 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1124 09:25:47.963641 1707070 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (328 bytes)
	I1124 09:25:47.976586 1707070 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (359 bytes)
	I1124 09:25:47.989056 1707070 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2087 bytes)
	I1124 09:25:48.003961 1707070 ssh_runner.go:195] Run: grep 192.168.49.2	control-plane.minikube.internal$ /etc/hosts
	I1124 09:25:48.011533 1707070 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1124 09:25:48.134407 1707070 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1124 09:25:48.383061 1707070 certs.go:69] Setting up /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288 for IP: 192.168.49.2
	I1124 09:25:48.383072 1707070 certs.go:195] generating shared ca certs ...
	I1124 09:25:48.383086 1707070 certs.go:227] acquiring lock for ca certs: {Name:mkbe540a30c4376a351176f7fe6fec044d058b09 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1124 09:25:48.383238 1707070 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.key
	I1124 09:25:48.383279 1707070 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/proxy-client-ca.key
	I1124 09:25:48.383286 1707070 certs.go:257] generating profile certs ...
	I1124 09:25:48.383366 1707070 certs.go:360] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/client.key
	I1124 09:25:48.383420 1707070 certs.go:360] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/apiserver.key.5acb2515
	I1124 09:25:48.383456 1707070 certs.go:360] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/proxy-client.key
	I1124 09:25:48.383562 1707070 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/1654467.pem (1338 bytes)
	W1124 09:25:48.383598 1707070 certs.go:480] ignoring /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/1654467_empty.pem, impossibly tiny 0 bytes
	I1124 09:25:48.383605 1707070 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca-key.pem (1671 bytes)
	I1124 09:25:48.383632 1707070 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem (1078 bytes)
	I1124 09:25:48.383655 1707070 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/cert.pem (1123 bytes)
	I1124 09:25:48.383684 1707070 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/key.pem (1679 bytes)
	I1124 09:25:48.383730 1707070 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/ssl/certs/16544672.pem (1708 bytes)
	I1124 09:25:48.384294 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1124 09:25:48.403533 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I1124 09:25:48.421212 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1124 09:25:48.441887 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I1124 09:25:48.462311 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I1124 09:25:48.480889 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I1124 09:25:48.499086 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1124 09:25:48.517112 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I1124 09:25:48.535554 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/ssl/certs/16544672.pem --> /usr/share/ca-certificates/16544672.pem (1708 bytes)
	I1124 09:25:48.553310 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1124 09:25:48.571447 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/1654467.pem --> /usr/share/ca-certificates/1654467.pem (1338 bytes)
	I1124 09:25:48.589094 1707070 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I1124 09:25:48.602393 1707070 ssh_runner.go:195] Run: openssl version
	I1124 09:25:48.608953 1707070 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/16544672.pem && ln -fs /usr/share/ca-certificates/16544672.pem /etc/ssl/certs/16544672.pem"
	I1124 09:25:48.617886 1707070 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/16544672.pem
	I1124 09:25:48.621697 1707070 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Nov 24 09:10 /usr/share/ca-certificates/16544672.pem
	I1124 09:25:48.621756 1707070 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/16544672.pem
	I1124 09:25:48.663214 1707070 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/16544672.pem /etc/ssl/certs/3ec20f2e.0"
	I1124 09:25:48.671328 1707070 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I1124 09:25:48.679977 1707070 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1124 09:25:48.683961 1707070 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Nov 24 08:44 /usr/share/ca-certificates/minikubeCA.pem
	I1124 09:25:48.684024 1707070 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1124 09:25:48.725273 1707070 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I1124 09:25:48.733278 1707070 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1654467.pem && ln -fs /usr/share/ca-certificates/1654467.pem /etc/ssl/certs/1654467.pem"
	I1124 09:25:48.741887 1707070 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1654467.pem
	I1124 09:25:48.745440 1707070 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Nov 24 09:10 /usr/share/ca-certificates/1654467.pem
	I1124 09:25:48.745500 1707070 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1654467.pem
	I1124 09:25:48.791338 1707070 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1654467.pem /etc/ssl/certs/51391683.0"
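	The `openssl x509 -hash` / `ln -fs` sequence above is the standard OpenSSL trust-store convention: each CA certificate is installed under `/usr/share/ca-certificates` and symlinked into `/etc/ssl/certs` under its subject-hash name (e.g. `b5213941.0`) so OpenSSL can locate it by hash at verification time. A minimal sketch of the same scheme, using a throwaway self-signed cert in a temp directory (paths here are illustrative, not minikube's):

```shell
# Reproduce the subject-hash symlink scheme with a throwaway CA cert.
set -e
dir=$(mktemp -d)
openssl req -x509 -newkey rsa:2048 -nodes -days 1 \
  -subj "/CN=demoCA" -keyout "$dir/ca.key" -out "$dir/ca.pem" 2>/dev/null
# -hash prints the subject-name hash OpenSSL uses for directory lookup.
hash=$(openssl x509 -hash -noout -in "$dir/ca.pem")
# Link the cert under "<hash>.0", mirroring /etc/ssl/certs/<hash>.0.
ln -fs "$dir/ca.pem" "$dir/$hash.0"
ls -l "$dir/$hash.0"
```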
	I1124 09:25:48.799503 1707070 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1124 09:25:48.803145 1707070 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I1124 09:25:48.844016 1707070 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I1124 09:25:48.884962 1707070 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I1124 09:25:48.926044 1707070 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I1124 09:25:48.967289 1707070 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I1124 09:25:49.008697 1707070 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
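	The `-checkend 86400` runs above are 24-hour expiry probes: `openssl x509 -checkend N` exits 0 if the certificate will still be valid N seconds from now, and 1 if it will have expired by then. A small demo with a fresh one-day self-signed cert (file names are illustrative):

```shell
# -checkend N: exit 0 if still valid N seconds from now, 1 otherwise.
set -e
tmp=$(mktemp -d)
openssl req -x509 -newkey rsa:2048 -nodes -days 1 \
  -subj "/CN=expiry-demo" -keyout "$tmp/k.pem" -out "$tmp/c.pem" 2>/dev/null
# A 1-day cert is still valid one hour from now ...
openssl x509 -noout -in "$tmp/c.pem" -checkend 3600 && echo "valid for 1h"
# ... but will have expired 48 hours from now.
openssl x509 -noout -in "$tmp/c.pem" -checkend 172800 || echo "expires within 48h"
```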
	I1124 09:25:49.049934 1707070 kubeadm.go:401] StartCluster: {Name:functional-291288 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-291288 Namespace:default APIServerHAVIP: APIServerName:minikubeCA
APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryM
irror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1124 09:25:49.050012 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I1124 09:25:49.050074 1707070 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1124 09:25:49.080420 1707070 cri.go:89] found id: ""
	I1124 09:25:49.080484 1707070 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1124 09:25:49.088364 1707070 kubeadm.go:417] found existing configuration files, will attempt cluster restart
	I1124 09:25:49.088374 1707070 kubeadm.go:598] restartPrimaryControlPlane start ...
	I1124 09:25:49.088425 1707070 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I1124 09:25:49.095680 1707070 kubeadm.go:131] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I1124 09:25:49.096194 1707070 kubeconfig.go:125] found "functional-291288" server: "https://192.168.49.2:8441"
	I1124 09:25:49.097500 1707070 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I1124 09:25:49.105267 1707070 kubeadm.go:645] detected kubeadm config drift (will reconfigure cluster from new /var/tmp/minikube/kubeadm.yaml):
	-- stdout --
	--- /var/tmp/minikube/kubeadm.yaml	2025-11-24 09:11:10.138797725 +0000
	+++ /var/tmp/minikube/kubeadm.yaml.new	2025-11-24 09:25:47.995648074 +0000
	@@ -24,7 +24,7 @@
	   certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	   extraArgs:
	     - name: "enable-admission-plugins"
	-      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	+      value: "NamespaceAutoProvision"
	 controllerManager:
	   extraArgs:
	     - name: "allocate-node-cidrs"
	
	-- /stdout --
	I1124 09:25:49.105285 1707070 kubeadm.go:1161] stopping kube-system containers ...
	I1124 09:25:49.105296 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name: Namespaces:[kube-system]}
	I1124 09:25:49.105351 1707070 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1124 09:25:49.142256 1707070 cri.go:89] found id: ""
	I1124 09:25:49.142317 1707070 ssh_runner.go:195] Run: sudo systemctl stop kubelet
	I1124 09:25:49.162851 1707070 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1124 09:25:49.170804 1707070 kubeadm.go:158] found existing configuration files:
	-rw------- 1 root root 5635 Nov 24 09:15 /etc/kubernetes/admin.conf
	-rw------- 1 root root 5636 Nov 24 09:15 /etc/kubernetes/controller-manager.conf
	-rw------- 1 root root 5672 Nov 24 09:15 /etc/kubernetes/kubelet.conf
	-rw------- 1 root root 5584 Nov 24 09:15 /etc/kubernetes/scheduler.conf
	
	I1124 09:25:49.170876 1707070 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1124 09:25:49.178603 1707070 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1124 09:25:49.185907 1707070 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1124 09:25:49.185964 1707070 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1124 09:25:49.193453 1707070 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1124 09:25:49.200815 1707070 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1124 09:25:49.200869 1707070 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1124 09:25:49.208328 1707070 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1124 09:25:49.215968 1707070 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1124 09:25:49.216025 1707070 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1124 09:25:49.223400 1707070 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1124 09:25:49.230953 1707070 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase certs all --config /var/tmp/minikube/kubeadm.yaml"
	I1124 09:25:49.277779 1707070 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase kubeconfig all --config /var/tmp/minikube/kubeadm.yaml"
	I1124 09:25:50.308934 1707070 ssh_runner.go:235] Completed: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase kubeconfig all --config /var/tmp/minikube/kubeadm.yaml": (1.031131442s)
	I1124 09:25:50.308993 1707070 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase kubelet-start --config /var/tmp/minikube/kubeadm.yaml"
	I1124 09:25:50.511648 1707070 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase control-plane all --config /var/tmp/minikube/kubeadm.yaml"
	I1124 09:25:50.576653 1707070 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase etcd local --config /var/tmp/minikube/kubeadm.yaml"
	I1124 09:25:50.625775 1707070 api_server.go:52] waiting for apiserver process to appear ...
	I1124 09:25:50.625855 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:51.126713 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:51.625939 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:52.126677 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:52.626053 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:53.126113 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:53.626972 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:54.126493 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:54.626036 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:55.126171 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:55.626853 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:56.126041 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:56.626177 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:57.126019 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:57.626847 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:58.126017 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:58.626716 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:59.125997 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:59.626367 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:00.125951 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:00.626013 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:01.126844 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:01.626038 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:02.126420 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:02.626727 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:03.126582 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:03.626068 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:04.126304 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:04.626830 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:05.126754 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:05.625961 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:06.126197 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:06.626039 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:07.126915 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:07.626052 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:08.126281 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:08.626116 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:09.126574 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:09.626068 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:10.125978 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:10.626328 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:11.126416 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:11.626073 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:12.126027 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:12.626174 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:13.126044 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:13.626781 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:14.126849 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:14.626203 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:15.125957 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:15.626068 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:16.126934 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:16.626382 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:17.126245 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:17.626034 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:18.126745 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:18.626942 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:19.126393 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:19.626607 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:20.126050 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:20.626732 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:21.126049 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:21.626115 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:22.125988 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:22.626261 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:23.126293 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:23.626107 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:24.126971 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:24.626009 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:25.126859 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:25.626876 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:26.126041 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:26.625983 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:27.126168 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:27.626079 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:28.126047 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:28.626761 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:29.126598 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:29.626290 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:30.125941 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:30.626102 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:31.126717 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:31.626588 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:32.126223 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:32.626875 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:33.126051 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:33.625963 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:34.126808 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:34.626621 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:35.126147 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:35.626018 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:36.126039 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:36.625970 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:37.126579 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:37.626198 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:38.126718 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:38.626386 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:39.126159 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:39.626590 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:40.126050 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:40.626422 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:41.126600 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:41.626097 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:42.127732 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:42.626108 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:43.126855 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:43.626202 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:44.126380 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:44.626423 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:45.127019 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:45.626257 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:46.125911 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:46.626125 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:47.126026 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:47.626915 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:48.126322 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:48.626706 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:49.126864 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:49.627009 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:50.126375 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:50.626418 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:26:50.626521 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:26:50.654529 1707070 cri.go:89] found id: ""
	I1124 09:26:50.654543 1707070 logs.go:282] 0 containers: []
	W1124 09:26:50.654550 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:26:50.654555 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:26:50.654624 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:26:50.683038 1707070 cri.go:89] found id: ""
	I1124 09:26:50.683052 1707070 logs.go:282] 0 containers: []
	W1124 09:26:50.683059 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:26:50.683064 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:26:50.683121 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:26:50.711396 1707070 cri.go:89] found id: ""
	I1124 09:26:50.711410 1707070 logs.go:282] 0 containers: []
	W1124 09:26:50.711422 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:26:50.711433 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:26:50.711498 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:26:50.735435 1707070 cri.go:89] found id: ""
	I1124 09:26:50.735449 1707070 logs.go:282] 0 containers: []
	W1124 09:26:50.735457 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:26:50.735463 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:26:50.735520 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:26:50.760437 1707070 cri.go:89] found id: ""
	I1124 09:26:50.760451 1707070 logs.go:282] 0 containers: []
	W1124 09:26:50.760458 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:26:50.760464 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:26:50.760520 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:26:50.785555 1707070 cri.go:89] found id: ""
	I1124 09:26:50.785576 1707070 logs.go:282] 0 containers: []
	W1124 09:26:50.785584 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:26:50.785590 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:26:50.785662 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:26:50.810261 1707070 cri.go:89] found id: ""
	I1124 09:26:50.810278 1707070 logs.go:282] 0 containers: []
	W1124 09:26:50.810286 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:26:50.810294 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:26:50.810305 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:26:50.879322 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:26:50.870488   11382 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:50.871030   11382 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:50.872890   11382 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:50.873352   11382 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:50.875005   11382 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:26:50.870488   11382 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:50.871030   11382 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:50.872890   11382 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:50.873352   11382 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:50.875005   11382 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:26:50.879334 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:26:50.879345 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:26:50.941117 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:26:50.941140 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:26:50.969259 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:26:50.969275 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:26:51.024741 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:26:51.024763 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:26:53.542977 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:53.553083 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:26:53.553155 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:26:53.577781 1707070 cri.go:89] found id: ""
	I1124 09:26:53.577795 1707070 logs.go:282] 0 containers: []
	W1124 09:26:53.577802 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:26:53.577808 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:26:53.577866 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:26:53.604191 1707070 cri.go:89] found id: ""
	I1124 09:26:53.604205 1707070 logs.go:282] 0 containers: []
	W1124 09:26:53.604212 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:26:53.604217 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:26:53.604277 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:26:53.632984 1707070 cri.go:89] found id: ""
	I1124 09:26:53.632998 1707070 logs.go:282] 0 containers: []
	W1124 09:26:53.633004 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:26:53.633010 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:26:53.633071 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:26:53.663828 1707070 cri.go:89] found id: ""
	I1124 09:26:53.663842 1707070 logs.go:282] 0 containers: []
	W1124 09:26:53.663850 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:26:53.663856 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:26:53.663912 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:26:53.695173 1707070 cri.go:89] found id: ""
	I1124 09:26:53.695187 1707070 logs.go:282] 0 containers: []
	W1124 09:26:53.695195 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:26:53.695200 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:26:53.695259 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:26:53.719882 1707070 cri.go:89] found id: ""
	I1124 09:26:53.719897 1707070 logs.go:282] 0 containers: []
	W1124 09:26:53.719904 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:26:53.719910 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:26:53.719993 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:26:53.753006 1707070 cri.go:89] found id: ""
	I1124 09:26:53.753020 1707070 logs.go:282] 0 containers: []
	W1124 09:26:53.753038 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:26:53.753046 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:26:53.753057 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:26:53.810839 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:26:53.810864 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:26:53.828132 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:26:53.828149 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:26:53.893802 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:26:53.885327   11494 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:53.886130   11494 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:53.888016   11494 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:53.888539   11494 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:53.890056   11494 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:26:53.885327   11494 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:53.886130   11494 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:53.888016   11494 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:53.888539   11494 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:53.890056   11494 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:26:53.893815 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:26:53.893825 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:26:53.955840 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:26:53.955860 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:26:56.485625 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:56.495752 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:26:56.495812 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:26:56.523600 1707070 cri.go:89] found id: ""
	I1124 09:26:56.523614 1707070 logs.go:282] 0 containers: []
	W1124 09:26:56.523622 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:26:56.523627 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:26:56.523730 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:26:56.547432 1707070 cri.go:89] found id: ""
	I1124 09:26:56.547445 1707070 logs.go:282] 0 containers: []
	W1124 09:26:56.547453 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:26:56.547465 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:26:56.547522 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:26:56.571895 1707070 cri.go:89] found id: ""
	I1124 09:26:56.571909 1707070 logs.go:282] 0 containers: []
	W1124 09:26:56.571917 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:26:56.571922 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:26:56.571977 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:26:56.596624 1707070 cri.go:89] found id: ""
	I1124 09:26:56.596637 1707070 logs.go:282] 0 containers: []
	W1124 09:26:56.596644 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:26:56.596650 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:26:56.596705 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:26:56.621497 1707070 cri.go:89] found id: ""
	I1124 09:26:56.621511 1707070 logs.go:282] 0 containers: []
	W1124 09:26:56.621518 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:26:56.621523 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:26:56.621588 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:26:56.656808 1707070 cri.go:89] found id: ""
	I1124 09:26:56.656822 1707070 logs.go:282] 0 containers: []
	W1124 09:26:56.656829 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:26:56.656834 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:26:56.656891 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:26:56.693750 1707070 cri.go:89] found id: ""
	I1124 09:26:56.693763 1707070 logs.go:282] 0 containers: []
	W1124 09:26:56.693770 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:26:56.693778 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:26:56.693799 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:26:56.711624 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:26:56.711642 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:26:56.772006 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:26:56.764543   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:56.764946   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:56.766216   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:56.766780   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:56.768382   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:26:56.764543   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:56.764946   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:56.766216   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:56.766780   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:56.768382   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:26:56.772020 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:26:56.772030 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:26:56.832784 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:26:56.832805 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:26:56.862164 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:26:56.862179 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:26:59.417328 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:59.427445 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:26:59.427506 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:26:59.451539 1707070 cri.go:89] found id: ""
	I1124 09:26:59.451574 1707070 logs.go:282] 0 containers: []
	W1124 09:26:59.451582 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:26:59.451588 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:26:59.451647 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:26:59.476110 1707070 cri.go:89] found id: ""
	I1124 09:26:59.476124 1707070 logs.go:282] 0 containers: []
	W1124 09:26:59.476131 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:26:59.476137 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:26:59.476194 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:26:59.504520 1707070 cri.go:89] found id: ""
	I1124 09:26:59.504533 1707070 logs.go:282] 0 containers: []
	W1124 09:26:59.504540 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:26:59.504546 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:26:59.504607 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:26:59.529647 1707070 cri.go:89] found id: ""
	I1124 09:26:59.529662 1707070 logs.go:282] 0 containers: []
	W1124 09:26:59.529669 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:26:59.529674 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:26:59.529753 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:26:59.558904 1707070 cri.go:89] found id: ""
	I1124 09:26:59.558918 1707070 logs.go:282] 0 containers: []
	W1124 09:26:59.558925 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:26:59.558930 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:26:59.558999 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:26:59.583698 1707070 cri.go:89] found id: ""
	I1124 09:26:59.583712 1707070 logs.go:282] 0 containers: []
	W1124 09:26:59.583733 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:26:59.583738 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:26:59.583800 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:26:59.607605 1707070 cri.go:89] found id: ""
	I1124 09:26:59.607619 1707070 logs.go:282] 0 containers: []
	W1124 09:26:59.607626 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:26:59.607634 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:26:59.607645 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:26:59.624446 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:26:59.624462 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:26:59.711588 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:26:59.701837   11699 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:59.703242   11699 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:59.704228   11699 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:59.706009   11699 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:59.706513   11699 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:26:59.701837   11699 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:59.703242   11699 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:59.704228   11699 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:59.706009   11699 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:59.706513   11699 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:26:59.711600 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:26:59.711610 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:26:59.777617 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:26:59.777638 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:26:59.810868 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:26:59.810888 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:02.368395 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:02.379444 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:02.379503 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:02.403995 1707070 cri.go:89] found id: ""
	I1124 09:27:02.404009 1707070 logs.go:282] 0 containers: []
	W1124 09:27:02.404017 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:02.404022 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:02.404080 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:02.428532 1707070 cri.go:89] found id: ""
	I1124 09:27:02.428546 1707070 logs.go:282] 0 containers: []
	W1124 09:27:02.428553 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:02.428559 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:02.428623 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:02.455148 1707070 cri.go:89] found id: ""
	I1124 09:27:02.455162 1707070 logs.go:282] 0 containers: []
	W1124 09:27:02.455169 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:02.455174 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:02.455233 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:02.479942 1707070 cri.go:89] found id: ""
	I1124 09:27:02.479957 1707070 logs.go:282] 0 containers: []
	W1124 09:27:02.479969 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:02.479975 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:02.480034 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:02.505728 1707070 cri.go:89] found id: ""
	I1124 09:27:02.505744 1707070 logs.go:282] 0 containers: []
	W1124 09:27:02.505751 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:02.505760 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:02.505845 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:02.536863 1707070 cri.go:89] found id: ""
	I1124 09:27:02.536881 1707070 logs.go:282] 0 containers: []
	W1124 09:27:02.536889 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:02.536894 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:02.536960 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:02.566083 1707070 cri.go:89] found id: ""
	I1124 09:27:02.566107 1707070 logs.go:282] 0 containers: []
	W1124 09:27:02.566124 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:02.566132 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:02.566142 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:02.628402 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:02.628423 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:02.669505 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:02.669523 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:02.737879 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:02.737907 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:02.755317 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:02.755334 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:02.820465 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:02.811248   11818 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:02.812608   11818 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:02.813513   11818 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:02.815318   11818 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:02.815727   11818 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:02.811248   11818 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:02.812608   11818 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:02.813513   11818 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:02.815318   11818 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:02.815727   11818 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:05.320749 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:05.331020 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:05.331081 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:05.355889 1707070 cri.go:89] found id: ""
	I1124 09:27:05.355904 1707070 logs.go:282] 0 containers: []
	W1124 09:27:05.355912 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:05.355917 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:05.355980 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:05.381650 1707070 cri.go:89] found id: ""
	I1124 09:27:05.381664 1707070 logs.go:282] 0 containers: []
	W1124 09:27:05.381671 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:05.381676 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:05.381733 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:05.410311 1707070 cri.go:89] found id: ""
	I1124 09:27:05.410325 1707070 logs.go:282] 0 containers: []
	W1124 09:27:05.410332 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:05.410337 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:05.410396 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:05.434601 1707070 cri.go:89] found id: ""
	I1124 09:27:05.434615 1707070 logs.go:282] 0 containers: []
	W1124 09:27:05.434621 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:05.434627 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:05.434684 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:05.459196 1707070 cri.go:89] found id: ""
	I1124 09:27:05.459210 1707070 logs.go:282] 0 containers: []
	W1124 09:27:05.459218 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:05.459223 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:05.459294 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:05.483433 1707070 cri.go:89] found id: ""
	I1124 09:27:05.483448 1707070 logs.go:282] 0 containers: []
	W1124 09:27:05.483455 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:05.483460 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:05.483523 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:05.508072 1707070 cri.go:89] found id: ""
	I1124 09:27:05.508086 1707070 logs.go:282] 0 containers: []
	W1124 09:27:05.508093 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:05.508101 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:05.508111 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:05.563733 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:05.563752 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:05.584705 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:05.584736 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:05.666380 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:05.657873   11904 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:05.658740   11904 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:05.660432   11904 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:05.660828   11904 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:05.662363   11904 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:05.657873   11904 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:05.658740   11904 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:05.660432   11904 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:05.660828   11904 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:05.662363   11904 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:05.666394 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:05.666405 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:05.738526 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:05.738548 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:08.268404 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:08.278347 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:08.278408 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:08.303562 1707070 cri.go:89] found id: ""
	I1124 09:27:08.303577 1707070 logs.go:282] 0 containers: []
	W1124 09:27:08.303585 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:08.303590 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:08.303651 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:08.329886 1707070 cri.go:89] found id: ""
	I1124 09:27:08.329900 1707070 logs.go:282] 0 containers: []
	W1124 09:27:08.329907 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:08.329913 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:08.329971 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:08.355081 1707070 cri.go:89] found id: ""
	I1124 09:27:08.355096 1707070 logs.go:282] 0 containers: []
	W1124 09:27:08.355104 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:08.355110 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:08.355175 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:08.381511 1707070 cri.go:89] found id: ""
	I1124 09:27:08.381534 1707070 logs.go:282] 0 containers: []
	W1124 09:27:08.381543 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:08.381549 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:08.381620 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:08.410606 1707070 cri.go:89] found id: ""
	I1124 09:27:08.410629 1707070 logs.go:282] 0 containers: []
	W1124 09:27:08.410637 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:08.410642 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:08.410700 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:08.434980 1707070 cri.go:89] found id: ""
	I1124 09:27:08.434994 1707070 logs.go:282] 0 containers: []
	W1124 09:27:08.435001 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:08.435007 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:08.435064 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:08.463780 1707070 cri.go:89] found id: ""
	I1124 09:27:08.463793 1707070 logs.go:282] 0 containers: []
	W1124 09:27:08.463800 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:08.463808 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:08.463819 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:08.527201 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:08.518614   12003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:08.519320   12003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:08.521220   12003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:08.521832   12003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:08.523649   12003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:08.518614   12003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:08.519320   12003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:08.521220   12003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:08.521832   12003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:08.523649   12003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:08.527213 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:08.527223 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:08.591559 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:08.591581 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:08.619107 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:08.619125 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:08.678658 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:08.678675 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:11.199028 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:11.209463 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:11.209529 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:11.236040 1707070 cri.go:89] found id: ""
	I1124 09:27:11.236061 1707070 logs.go:282] 0 containers: []
	W1124 09:27:11.236069 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:11.236075 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:11.236145 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:11.263895 1707070 cri.go:89] found id: ""
	I1124 09:27:11.263906 1707070 logs.go:282] 0 containers: []
	W1124 09:27:11.263912 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:11.263917 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:11.263968 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:11.290492 1707070 cri.go:89] found id: ""
	I1124 09:27:11.290507 1707070 logs.go:282] 0 containers: []
	W1124 09:27:11.290514 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:11.290519 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:11.290575 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:11.316763 1707070 cri.go:89] found id: ""
	I1124 09:27:11.316778 1707070 logs.go:282] 0 containers: []
	W1124 09:27:11.316785 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:11.316791 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:11.316899 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:11.340653 1707070 cri.go:89] found id: ""
	I1124 09:27:11.340668 1707070 logs.go:282] 0 containers: []
	W1124 09:27:11.340675 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:11.340680 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:11.340741 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:11.365000 1707070 cri.go:89] found id: ""
	I1124 09:27:11.365013 1707070 logs.go:282] 0 containers: []
	W1124 09:27:11.365020 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:11.365026 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:11.365086 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:11.393012 1707070 cri.go:89] found id: ""
	I1124 09:27:11.393025 1707070 logs.go:282] 0 containers: []
	W1124 09:27:11.393033 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:11.393041 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:11.393053 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:11.409740 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:11.409758 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:11.474068 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:11.465242   12112 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:11.466095   12112 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:11.467959   12112 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:11.468588   12112 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:11.470448   12112 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:11.465242   12112 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:11.466095   12112 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:11.467959   12112 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:11.468588   12112 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:11.470448   12112 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:11.474079 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:11.474089 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:11.535411 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:11.535433 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:11.565626 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:11.565645 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:14.123823 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:14.133770 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:14.133829 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:14.157476 1707070 cri.go:89] found id: ""
	I1124 09:27:14.157490 1707070 logs.go:282] 0 containers: []
	W1124 09:27:14.157497 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:14.157503 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:14.157562 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:14.188747 1707070 cri.go:89] found id: ""
	I1124 09:27:14.188761 1707070 logs.go:282] 0 containers: []
	W1124 09:27:14.188768 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:14.188773 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:14.188830 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:14.216257 1707070 cri.go:89] found id: ""
	I1124 09:27:14.216271 1707070 logs.go:282] 0 containers: []
	W1124 09:27:14.216279 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:14.216284 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:14.216345 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:14.241336 1707070 cri.go:89] found id: ""
	I1124 09:27:14.241349 1707070 logs.go:282] 0 containers: []
	W1124 09:27:14.241357 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:14.241362 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:14.241423 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:14.265223 1707070 cri.go:89] found id: ""
	I1124 09:27:14.265238 1707070 logs.go:282] 0 containers: []
	W1124 09:27:14.265245 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:14.265250 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:14.265312 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:14.292087 1707070 cri.go:89] found id: ""
	I1124 09:27:14.292101 1707070 logs.go:282] 0 containers: []
	W1124 09:27:14.292108 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:14.292114 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:14.292171 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:14.316839 1707070 cri.go:89] found id: ""
	I1124 09:27:14.316854 1707070 logs.go:282] 0 containers: []
	W1124 09:27:14.316861 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:14.316869 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:14.316879 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:14.371692 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:14.371715 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:14.388964 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:14.388980 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:14.455069 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:14.447375   12220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:14.448018   12220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:14.449517   12220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:14.449819   12220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:14.451683   12220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:14.447375   12220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:14.448018   12220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:14.449517   12220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:14.449819   12220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:14.451683   12220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:14.455080 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:14.455090 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:14.518102 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:14.518124 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:17.045537 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:17.055937 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:17.056004 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:17.084357 1707070 cri.go:89] found id: ""
	I1124 09:27:17.084370 1707070 logs.go:282] 0 containers: []
	W1124 09:27:17.084378 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:17.084383 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:17.084439 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:17.112022 1707070 cri.go:89] found id: ""
	I1124 09:27:17.112035 1707070 logs.go:282] 0 containers: []
	W1124 09:27:17.112043 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:17.112048 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:17.112110 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:17.135317 1707070 cri.go:89] found id: ""
	I1124 09:27:17.135331 1707070 logs.go:282] 0 containers: []
	W1124 09:27:17.135338 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:17.135343 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:17.135399 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:17.163850 1707070 cri.go:89] found id: ""
	I1124 09:27:17.163865 1707070 logs.go:282] 0 containers: []
	W1124 09:27:17.163872 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:17.163878 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:17.163933 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:17.188915 1707070 cri.go:89] found id: ""
	I1124 09:27:17.188929 1707070 logs.go:282] 0 containers: []
	W1124 09:27:17.188936 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:17.188941 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:17.188997 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:17.217448 1707070 cri.go:89] found id: ""
	I1124 09:27:17.217461 1707070 logs.go:282] 0 containers: []
	W1124 09:27:17.217475 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:17.217480 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:17.217537 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:17.242521 1707070 cri.go:89] found id: ""
	I1124 09:27:17.242536 1707070 logs.go:282] 0 containers: []
	W1124 09:27:17.242543 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:17.242551 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:17.242561 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:17.297899 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:17.297921 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:17.315278 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:17.315297 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:17.377620 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:17.368489   12324 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:17.368893   12324 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:17.370596   12324 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:17.371050   12324 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:17.372486   12324 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:17.368489   12324 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:17.368893   12324 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:17.370596   12324 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:17.371050   12324 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:17.372486   12324 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:17.377640 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:17.377651 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:17.439884 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:17.439907 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:19.969337 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:19.979536 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:19.979595 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:20.018198 1707070 cri.go:89] found id: ""
	I1124 09:27:20.018220 1707070 logs.go:282] 0 containers: []
	W1124 09:27:20.018229 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:20.018235 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:20.018297 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:20.046055 1707070 cri.go:89] found id: ""
	I1124 09:27:20.046070 1707070 logs.go:282] 0 containers: []
	W1124 09:27:20.046077 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:20.046082 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:20.046158 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:20.078159 1707070 cri.go:89] found id: ""
	I1124 09:27:20.078183 1707070 logs.go:282] 0 containers: []
	W1124 09:27:20.078191 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:20.078197 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:20.078289 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:20.104136 1707070 cri.go:89] found id: ""
	I1124 09:27:20.104151 1707070 logs.go:282] 0 containers: []
	W1124 09:27:20.104158 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:20.104164 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:20.104228 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:20.130266 1707070 cri.go:89] found id: ""
	I1124 09:27:20.130280 1707070 logs.go:282] 0 containers: []
	W1124 09:27:20.130288 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:20.130293 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:20.130352 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:20.156899 1707070 cri.go:89] found id: ""
	I1124 09:27:20.156913 1707070 logs.go:282] 0 containers: []
	W1124 09:27:20.156921 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:20.156926 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:20.156986 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:20.182706 1707070 cri.go:89] found id: ""
	I1124 09:27:20.182721 1707070 logs.go:282] 0 containers: []
	W1124 09:27:20.182728 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:20.182736 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:20.182747 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:20.240720 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:20.240740 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:20.257971 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:20.257987 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:20.324806 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:20.316231   12432 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:20.316929   12432 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:20.317881   12432 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:20.319464   12432 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:20.319918   12432 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:20.316231   12432 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:20.316929   12432 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:20.317881   12432 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:20.319464   12432 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:20.319918   12432 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:20.324827 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:20.324838 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:20.386188 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:20.386212 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:22.915679 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:22.927190 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:22.927254 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:22.959235 1707070 cri.go:89] found id: ""
	I1124 09:27:22.959249 1707070 logs.go:282] 0 containers: []
	W1124 09:27:22.959256 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:22.959262 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:22.959318 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:22.986124 1707070 cri.go:89] found id: ""
	I1124 09:27:22.986138 1707070 logs.go:282] 0 containers: []
	W1124 09:27:22.986146 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:22.986151 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:22.986206 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:23.016094 1707070 cri.go:89] found id: ""
	I1124 09:27:23.016108 1707070 logs.go:282] 0 containers: []
	W1124 09:27:23.016116 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:23.016121 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:23.016183 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:23.044417 1707070 cri.go:89] found id: ""
	I1124 09:27:23.044431 1707070 logs.go:282] 0 containers: []
	W1124 09:27:23.044439 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:23.044444 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:23.044501 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:23.069468 1707070 cri.go:89] found id: ""
	I1124 09:27:23.069484 1707070 logs.go:282] 0 containers: []
	W1124 09:27:23.069491 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:23.069497 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:23.069556 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:23.096521 1707070 cri.go:89] found id: ""
	I1124 09:27:23.096535 1707070 logs.go:282] 0 containers: []
	W1124 09:27:23.096542 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:23.096548 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:23.096605 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:23.125327 1707070 cri.go:89] found id: ""
	I1124 09:27:23.125342 1707070 logs.go:282] 0 containers: []
	W1124 09:27:23.125349 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:23.125358 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:23.125367 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:23.180584 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:23.180605 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:23.197372 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:23.197388 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:23.259943 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:23.251679   12538 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:23.252410   12538 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:23.253306   12538 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:23.254866   12538 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:23.255334   12538 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:23.251679   12538 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:23.252410   12538 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:23.253306   12538 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:23.254866   12538 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:23.255334   12538 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:23.259953 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:23.259965 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:23.325045 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:23.325066 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:25.855733 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:25.866329 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:25.866395 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:25.906494 1707070 cri.go:89] found id: ""
	I1124 09:27:25.906508 1707070 logs.go:282] 0 containers: []
	W1124 09:27:25.906516 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:25.906521 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:25.906590 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:25.945205 1707070 cri.go:89] found id: ""
	I1124 09:27:25.945229 1707070 logs.go:282] 0 containers: []
	W1124 09:27:25.945237 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:25.945242 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:25.945301 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:25.970721 1707070 cri.go:89] found id: ""
	I1124 09:27:25.970736 1707070 logs.go:282] 0 containers: []
	W1124 09:27:25.970743 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:25.970749 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:25.970807 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:25.997334 1707070 cri.go:89] found id: ""
	I1124 09:27:25.997348 1707070 logs.go:282] 0 containers: []
	W1124 09:27:25.997355 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:25.997364 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:25.997438 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:26.029916 1707070 cri.go:89] found id: ""
	I1124 09:27:26.029932 1707070 logs.go:282] 0 containers: []
	W1124 09:27:26.029940 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:26.029945 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:26.030007 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:26.057466 1707070 cri.go:89] found id: ""
	I1124 09:27:26.057480 1707070 logs.go:282] 0 containers: []
	W1124 09:27:26.057488 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:26.057494 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:26.057565 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:26.083489 1707070 cri.go:89] found id: ""
	I1124 09:27:26.083503 1707070 logs.go:282] 0 containers: []
	W1124 09:27:26.083511 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:26.083519 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:26.083529 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:26.140569 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:26.140588 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:26.158554 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:26.158571 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:26.230573 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:26.222615   12645 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:26.223218   12645 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:26.224819   12645 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:26.225472   12645 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:26.226976   12645 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:26.222615   12645 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:26.223218   12645 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:26.224819   12645 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:26.225472   12645 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:26.226976   12645 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:26.230583 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:26.230594 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:26.292417 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:26.292436 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:28.819944 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:28.830528 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:28.830587 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:28.854228 1707070 cri.go:89] found id: ""
	I1124 09:27:28.854243 1707070 logs.go:282] 0 containers: []
	W1124 09:27:28.854250 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:28.854260 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:28.854324 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:28.891203 1707070 cri.go:89] found id: ""
	I1124 09:27:28.891217 1707070 logs.go:282] 0 containers: []
	W1124 09:27:28.891224 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:28.891230 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:28.891305 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:28.918573 1707070 cri.go:89] found id: ""
	I1124 09:27:28.918587 1707070 logs.go:282] 0 containers: []
	W1124 09:27:28.918594 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:28.918600 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:28.918665 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:28.944672 1707070 cri.go:89] found id: ""
	I1124 09:27:28.944685 1707070 logs.go:282] 0 containers: []
	W1124 09:27:28.944692 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:28.944708 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:28.944763 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:28.970414 1707070 cri.go:89] found id: ""
	I1124 09:27:28.970429 1707070 logs.go:282] 0 containers: []
	W1124 09:27:28.970436 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:28.970441 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:28.970539 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:28.995438 1707070 cri.go:89] found id: ""
	I1124 09:27:28.995453 1707070 logs.go:282] 0 containers: []
	W1124 09:27:28.995460 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:28.995466 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:28.995526 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:29.023817 1707070 cri.go:89] found id: ""
	I1124 09:27:29.023832 1707070 logs.go:282] 0 containers: []
	W1124 09:27:29.023839 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:29.023847 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:29.023858 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:29.080316 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:29.080336 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:29.097486 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:29.097502 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:29.159875 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:29.151793   12751 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:29.152163   12751 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:29.153608   12751 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:29.154019   12751 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:29.155829   12751 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:29.151793   12751 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:29.152163   12751 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:29.153608   12751 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:29.154019   12751 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:29.155829   12751 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:29.159888 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:29.159907 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:29.223729 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:29.223754 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:31.751641 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:31.761798 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:31.761859 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:31.788691 1707070 cri.go:89] found id: ""
	I1124 09:27:31.788705 1707070 logs.go:282] 0 containers: []
	W1124 09:27:31.788711 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:31.788717 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:31.788776 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:31.812359 1707070 cri.go:89] found id: ""
	I1124 09:27:31.812374 1707070 logs.go:282] 0 containers: []
	W1124 09:27:31.812382 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:31.812387 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:31.812450 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:31.837276 1707070 cri.go:89] found id: ""
	I1124 09:27:31.837289 1707070 logs.go:282] 0 containers: []
	W1124 09:27:31.837296 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:31.837302 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:31.837360 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:31.862818 1707070 cri.go:89] found id: ""
	I1124 09:27:31.862832 1707070 logs.go:282] 0 containers: []
	W1124 09:27:31.862840 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:31.862846 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:31.862903 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:31.904922 1707070 cri.go:89] found id: ""
	I1124 09:27:31.904936 1707070 logs.go:282] 0 containers: []
	W1124 09:27:31.904944 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:31.904950 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:31.905012 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:31.949580 1707070 cri.go:89] found id: ""
	I1124 09:27:31.949594 1707070 logs.go:282] 0 containers: []
	W1124 09:27:31.949601 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:31.949607 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:31.949661 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:31.975157 1707070 cri.go:89] found id: ""
	I1124 09:27:31.975171 1707070 logs.go:282] 0 containers: []
	W1124 09:27:31.975178 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:31.975187 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:31.975198 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:32.004216 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:32.004239 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:32.064444 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:32.064466 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:32.084210 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:32.084229 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:32.152949 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:32.144237   12868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:32.145124   12868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:32.147159   12868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:32.147890   12868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:32.148900   12868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:32.144237   12868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:32.145124   12868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:32.147159   12868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:32.147890   12868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:32.148900   12868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:32.152963 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:32.152975 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:34.714493 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:34.725033 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:34.725101 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:34.750339 1707070 cri.go:89] found id: ""
	I1124 09:27:34.750352 1707070 logs.go:282] 0 containers: []
	W1124 09:27:34.750359 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:34.750365 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:34.750422 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:34.776574 1707070 cri.go:89] found id: ""
	I1124 09:27:34.776588 1707070 logs.go:282] 0 containers: []
	W1124 09:27:34.776595 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:34.776600 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:34.776656 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:34.801274 1707070 cri.go:89] found id: ""
	I1124 09:27:34.801288 1707070 logs.go:282] 0 containers: []
	W1124 09:27:34.801295 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:34.801300 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:34.801355 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:34.828204 1707070 cri.go:89] found id: ""
	I1124 09:27:34.828217 1707070 logs.go:282] 0 containers: []
	W1124 09:27:34.828224 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:34.828230 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:34.828286 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:34.856488 1707070 cri.go:89] found id: ""
	I1124 09:27:34.856502 1707070 logs.go:282] 0 containers: []
	W1124 09:27:34.856509 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:34.856514 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:34.856571 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:34.882889 1707070 cri.go:89] found id: ""
	I1124 09:27:34.882903 1707070 logs.go:282] 0 containers: []
	W1124 09:27:34.882914 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:34.882919 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:34.882988 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:34.914562 1707070 cri.go:89] found id: ""
	I1124 09:27:34.914576 1707070 logs.go:282] 0 containers: []
	W1124 09:27:34.914583 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:34.914591 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:34.914601 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:34.981562 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:34.981596 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:34.998925 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:34.998941 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:35.070877 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:35.062206   12962 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:35.063028   12962 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:35.064710   12962 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:35.065308   12962 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:35.067060   12962 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:35.062206   12962 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:35.063028   12962 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:35.064710   12962 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:35.065308   12962 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:35.067060   12962 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:35.070899 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:35.070909 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:35.137172 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:35.137193 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:37.666865 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:37.677121 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:37.677182 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:37.702376 1707070 cri.go:89] found id: ""
	I1124 09:27:37.702390 1707070 logs.go:282] 0 containers: []
	W1124 09:27:37.702398 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:37.702407 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:37.702491 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:37.727342 1707070 cri.go:89] found id: ""
	I1124 09:27:37.727355 1707070 logs.go:282] 0 containers: []
	W1124 09:27:37.727363 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:37.727368 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:37.727430 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:37.753323 1707070 cri.go:89] found id: ""
	I1124 09:27:37.753336 1707070 logs.go:282] 0 containers: []
	W1124 09:27:37.753343 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:37.753349 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:37.753409 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:37.781020 1707070 cri.go:89] found id: ""
	I1124 09:27:37.781041 1707070 logs.go:282] 0 containers: []
	W1124 09:27:37.781049 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:37.781055 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:37.781117 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:37.805925 1707070 cri.go:89] found id: ""
	I1124 09:27:37.805939 1707070 logs.go:282] 0 containers: []
	W1124 09:27:37.805946 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:37.805952 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:37.806013 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:37.833036 1707070 cri.go:89] found id: ""
	I1124 09:27:37.833062 1707070 logs.go:282] 0 containers: []
	W1124 09:27:37.833069 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:37.833075 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:37.833140 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:37.860115 1707070 cri.go:89] found id: ""
	I1124 09:27:37.860129 1707070 logs.go:282] 0 containers: []
	W1124 09:27:37.860137 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:37.860145 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:37.860156 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:37.926098 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:37.926118 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:37.960030 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:37.960045 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:38.019375 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:38.019395 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:38.039066 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:38.039085 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:38.110062 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:38.101570   13079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:38.102692   13079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:38.104495   13079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:38.105053   13079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:38.106366   13079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:38.101570   13079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:38.102692   13079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:38.104495   13079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:38.105053   13079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:38.106366   13079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:40.610482 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:40.620402 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:40.620472 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:40.648289 1707070 cri.go:89] found id: ""
	I1124 09:27:40.648303 1707070 logs.go:282] 0 containers: []
	W1124 09:27:40.648311 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:40.648317 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:40.648373 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:40.672588 1707070 cri.go:89] found id: ""
	I1124 09:27:40.672603 1707070 logs.go:282] 0 containers: []
	W1124 09:27:40.672610 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:40.672616 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:40.672673 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:40.700039 1707070 cri.go:89] found id: ""
	I1124 09:27:40.700053 1707070 logs.go:282] 0 containers: []
	W1124 09:27:40.700060 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:40.700066 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:40.700129 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:40.728494 1707070 cri.go:89] found id: ""
	I1124 09:27:40.728508 1707070 logs.go:282] 0 containers: []
	W1124 09:27:40.728516 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:40.728522 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:40.728582 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:40.753773 1707070 cri.go:89] found id: ""
	I1124 09:27:40.753786 1707070 logs.go:282] 0 containers: []
	W1124 09:27:40.753793 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:40.753798 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:40.753860 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:40.778243 1707070 cri.go:89] found id: ""
	I1124 09:27:40.778257 1707070 logs.go:282] 0 containers: []
	W1124 09:27:40.778264 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:40.778270 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:40.778333 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:40.804316 1707070 cri.go:89] found id: ""
	I1124 09:27:40.804329 1707070 logs.go:282] 0 containers: []
	W1124 09:27:40.804350 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:40.804358 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:40.804370 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:40.821314 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:40.821330 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:40.901213 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:40.878028   13164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:40.878824   13164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:40.894654   13164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:40.895170   13164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:40.896920   13164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:40.878028   13164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:40.878824   13164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:40.894654   13164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:40.895170   13164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:40.896920   13164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:40.901232 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:40.901242 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:40.972785 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:40.972806 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:41.000947 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:41.000967 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:43.560416 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:43.570821 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:43.570882 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:43.595557 1707070 cri.go:89] found id: ""
	I1124 09:27:43.595571 1707070 logs.go:282] 0 containers: []
	W1124 09:27:43.595579 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:43.595585 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:43.595640 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:43.623980 1707070 cri.go:89] found id: ""
	I1124 09:27:43.623996 1707070 logs.go:282] 0 containers: []
	W1124 09:27:43.624003 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:43.624008 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:43.624074 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:43.649674 1707070 cri.go:89] found id: ""
	I1124 09:27:43.649688 1707070 logs.go:282] 0 containers: []
	W1124 09:27:43.649695 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:43.649701 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:43.649758 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:43.673375 1707070 cri.go:89] found id: ""
	I1124 09:27:43.673388 1707070 logs.go:282] 0 containers: []
	W1124 09:27:43.673397 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:43.673403 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:43.673459 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:43.700917 1707070 cri.go:89] found id: ""
	I1124 09:27:43.700931 1707070 logs.go:282] 0 containers: []
	W1124 09:27:43.700938 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:43.700943 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:43.701000 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:43.725453 1707070 cri.go:89] found id: ""
	I1124 09:27:43.725467 1707070 logs.go:282] 0 containers: []
	W1124 09:27:43.725481 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:43.725487 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:43.725557 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:43.755304 1707070 cri.go:89] found id: ""
	I1124 09:27:43.755318 1707070 logs.go:282] 0 containers: []
	W1124 09:27:43.755326 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:43.755335 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:43.755346 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:43.772549 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:43.772567 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:43.837565 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:43.829378   13269 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:43.829969   13269 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:43.831587   13269 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:43.832265   13269 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:43.833938   13269 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1124 09:27:43.837575 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:43.837587 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:43.898949 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:43.898969 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:43.934259 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:43.934277 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:46.497111 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:46.507177 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:46.507251 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:46.531012 1707070 cri.go:89] found id: ""
	I1124 09:27:46.531025 1707070 logs.go:282] 0 containers: []
	W1124 09:27:46.531032 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:46.531038 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:46.531101 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:46.555781 1707070 cri.go:89] found id: ""
	I1124 09:27:46.555795 1707070 logs.go:282] 0 containers: []
	W1124 09:27:46.555802 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:46.555807 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:46.555864 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:46.580956 1707070 cri.go:89] found id: ""
	I1124 09:27:46.580974 1707070 logs.go:282] 0 containers: []
	W1124 09:27:46.580982 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:46.580987 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:46.581055 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:46.606320 1707070 cri.go:89] found id: ""
	I1124 09:27:46.606333 1707070 logs.go:282] 0 containers: []
	W1124 09:27:46.606340 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:46.606346 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:46.606414 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:46.632671 1707070 cri.go:89] found id: ""
	I1124 09:27:46.632685 1707070 logs.go:282] 0 containers: []
	W1124 09:27:46.632692 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:46.632697 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:46.632755 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:46.656948 1707070 cri.go:89] found id: ""
	I1124 09:27:46.656962 1707070 logs.go:282] 0 containers: []
	W1124 09:27:46.656969 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:46.656975 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:46.657037 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:46.681897 1707070 cri.go:89] found id: ""
	I1124 09:27:46.681910 1707070 logs.go:282] 0 containers: []
	W1124 09:27:46.681917 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:46.681925 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:46.681936 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:46.698822 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:46.698839 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:46.763473 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:46.755294   13373 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:46.755864   13373 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:46.757448   13373 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:46.758065   13373 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:46.759847   13373 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1124 09:27:46.763499 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:46.763510 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:46.826271 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:46.826293 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:46.855001 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:46.855017 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:49.412865 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:49.423511 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:49.423574 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:49.447618 1707070 cri.go:89] found id: ""
	I1124 09:27:49.447632 1707070 logs.go:282] 0 containers: []
	W1124 09:27:49.447639 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:49.447645 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:49.447705 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:49.476127 1707070 cri.go:89] found id: ""
	I1124 09:27:49.476140 1707070 logs.go:282] 0 containers: []
	W1124 09:27:49.476147 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:49.476154 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:49.476213 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:49.501684 1707070 cri.go:89] found id: ""
	I1124 09:27:49.501697 1707070 logs.go:282] 0 containers: []
	W1124 09:27:49.501705 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:49.501711 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:49.501771 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:49.527011 1707070 cri.go:89] found id: ""
	I1124 09:27:49.527025 1707070 logs.go:282] 0 containers: []
	W1124 09:27:49.527033 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:49.527038 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:49.527098 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:49.552026 1707070 cri.go:89] found id: ""
	I1124 09:27:49.552040 1707070 logs.go:282] 0 containers: []
	W1124 09:27:49.552047 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:49.552053 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:49.552110 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:49.582162 1707070 cri.go:89] found id: ""
	I1124 09:27:49.582189 1707070 logs.go:282] 0 containers: []
	W1124 09:27:49.582196 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:49.582202 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:49.582275 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:49.612653 1707070 cri.go:89] found id: ""
	I1124 09:27:49.612667 1707070 logs.go:282] 0 containers: []
	W1124 09:27:49.612675 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:49.612683 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:49.612693 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:49.668483 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:49.668504 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:49.685463 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:49.685480 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:49.750076 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:49.741868   13477 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:49.742309   13477 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:49.744083   13477 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:49.744608   13477 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:49.746375   13477 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1124 09:27:49.750136 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:49.750148 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:49.811614 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:49.811634 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:52.341239 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:52.351722 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:52.351784 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:52.378388 1707070 cri.go:89] found id: ""
	I1124 09:27:52.378402 1707070 logs.go:282] 0 containers: []
	W1124 09:27:52.378410 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:52.378416 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:52.378498 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:52.404052 1707070 cri.go:89] found id: ""
	I1124 09:27:52.404067 1707070 logs.go:282] 0 containers: []
	W1124 09:27:52.404074 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:52.404079 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:52.404138 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:52.428854 1707070 cri.go:89] found id: ""
	I1124 09:27:52.428868 1707070 logs.go:282] 0 containers: []
	W1124 09:27:52.428876 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:52.428882 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:52.428945 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:52.460795 1707070 cri.go:89] found id: ""
	I1124 09:27:52.460808 1707070 logs.go:282] 0 containers: []
	W1124 09:27:52.460815 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:52.460825 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:52.460886 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:52.490351 1707070 cri.go:89] found id: ""
	I1124 09:27:52.490365 1707070 logs.go:282] 0 containers: []
	W1124 09:27:52.490372 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:52.490378 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:52.490438 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:52.515789 1707070 cri.go:89] found id: ""
	I1124 09:27:52.515804 1707070 logs.go:282] 0 containers: []
	W1124 09:27:52.515811 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:52.515816 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:52.515874 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:52.544304 1707070 cri.go:89] found id: ""
	I1124 09:27:52.544318 1707070 logs.go:282] 0 containers: []
	W1124 09:27:52.544326 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:52.544335 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:52.544347 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:52.611718 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:52.603411   13577 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:52.604016   13577 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:52.605628   13577 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:52.606175   13577 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:52.607864   13577 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1124 09:27:52.611731 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:52.611743 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:52.679720 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:52.679740 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:52.708422 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:52.708437 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:52.766414 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:52.766433 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:55.285861 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:55.296023 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:55.296086 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:55.324396 1707070 cri.go:89] found id: ""
	I1124 09:27:55.324409 1707070 logs.go:282] 0 containers: []
	W1124 09:27:55.324417 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:55.324422 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:55.324478 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:55.348746 1707070 cri.go:89] found id: ""
	I1124 09:27:55.348760 1707070 logs.go:282] 0 containers: []
	W1124 09:27:55.348767 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:55.348773 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:55.348832 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:55.373685 1707070 cri.go:89] found id: ""
	I1124 09:27:55.373710 1707070 logs.go:282] 0 containers: []
	W1124 09:27:55.373718 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:55.373724 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:55.373780 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:55.399757 1707070 cri.go:89] found id: ""
	I1124 09:27:55.399774 1707070 logs.go:282] 0 containers: []
	W1124 09:27:55.399783 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:55.399789 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:55.399848 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:55.424773 1707070 cri.go:89] found id: ""
	I1124 09:27:55.424788 1707070 logs.go:282] 0 containers: []
	W1124 09:27:55.424795 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:55.424800 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:55.424862 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:55.450083 1707070 cri.go:89] found id: ""
	I1124 09:27:55.450097 1707070 logs.go:282] 0 containers: []
	W1124 09:27:55.450104 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:55.450112 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:55.450170 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:55.474225 1707070 cri.go:89] found id: ""
	I1124 09:27:55.474239 1707070 logs.go:282] 0 containers: []
	W1124 09:27:55.474247 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:55.474254 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:55.474264 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:55.507455 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:55.507477 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:55.563391 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:55.563414 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:55.583115 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:55.583131 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:55.648979 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:55.641409   13700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:55.642033   13700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:55.643543   13700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:55.644021   13700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:55.645529   13700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:55.641409   13700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:55.642033   13700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:55.643543   13700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:55.644021   13700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:55.645529   13700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:55.648991 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:55.649004 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:58.210584 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:58.221285 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:58.221351 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:58.250526 1707070 cri.go:89] found id: ""
	I1124 09:27:58.250541 1707070 logs.go:282] 0 containers: []
	W1124 09:27:58.250548 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:58.250554 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:58.250612 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:58.275099 1707070 cri.go:89] found id: ""
	I1124 09:27:58.275116 1707070 logs.go:282] 0 containers: []
	W1124 09:27:58.275123 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:58.275129 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:58.275189 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:58.300058 1707070 cri.go:89] found id: ""
	I1124 09:27:58.300075 1707070 logs.go:282] 0 containers: []
	W1124 09:27:58.300082 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:58.300087 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:58.300148 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:58.323564 1707070 cri.go:89] found id: ""
	I1124 09:27:58.323578 1707070 logs.go:282] 0 containers: []
	W1124 09:27:58.323585 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:58.323591 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:58.323648 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:58.348441 1707070 cri.go:89] found id: ""
	I1124 09:27:58.348455 1707070 logs.go:282] 0 containers: []
	W1124 09:27:58.348463 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:58.348468 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:58.348527 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:58.374283 1707070 cri.go:89] found id: ""
	I1124 09:27:58.374297 1707070 logs.go:282] 0 containers: []
	W1124 09:27:58.374305 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:58.374310 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:58.374371 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:58.400624 1707070 cri.go:89] found id: ""
	I1124 09:27:58.400638 1707070 logs.go:282] 0 containers: []
	W1124 09:27:58.400645 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:58.400653 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:58.400664 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:58.457055 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:58.457075 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:58.474204 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:58.474236 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:58.538738 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:58.530985   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:58.531628   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:58.533238   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:58.533555   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:58.535049   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:58.530985   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:58.531628   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:58.533238   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:58.533555   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:58.535049   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:58.538748 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:58.538761 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:58.601043 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:58.601064 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:01.129158 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:01.152628 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:01.152709 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:01.199688 1707070 cri.go:89] found id: ""
	I1124 09:28:01.199703 1707070 logs.go:282] 0 containers: []
	W1124 09:28:01.199710 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:01.199716 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:01.199778 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:01.226293 1707070 cri.go:89] found id: ""
	I1124 09:28:01.226307 1707070 logs.go:282] 0 containers: []
	W1124 09:28:01.226314 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:01.226319 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:01.226379 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:01.252021 1707070 cri.go:89] found id: ""
	I1124 09:28:01.252036 1707070 logs.go:282] 0 containers: []
	W1124 09:28:01.252043 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:01.252049 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:01.252108 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:01.278563 1707070 cri.go:89] found id: ""
	I1124 09:28:01.278577 1707070 logs.go:282] 0 containers: []
	W1124 09:28:01.278585 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:01.278591 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:01.278697 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:01.304781 1707070 cri.go:89] found id: ""
	I1124 09:28:01.304808 1707070 logs.go:282] 0 containers: []
	W1124 09:28:01.304816 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:01.304822 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:01.304900 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:01.330549 1707070 cri.go:89] found id: ""
	I1124 09:28:01.330574 1707070 logs.go:282] 0 containers: []
	W1124 09:28:01.330581 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:01.330586 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:01.330657 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:01.355624 1707070 cri.go:89] found id: ""
	I1124 09:28:01.355646 1707070 logs.go:282] 0 containers: []
	W1124 09:28:01.355654 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:01.355661 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:01.355673 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:01.411485 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:01.411504 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:01.428912 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:01.428927 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:01.493859 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:01.485758   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:01.486490   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:01.488127   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:01.488656   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:01.490257   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:01.485758   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:01.486490   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:01.488127   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:01.488656   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:01.490257   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:01.493881 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:01.493892 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:01.554787 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:01.554808 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:04.088481 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:04.099124 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:04.099191 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:04.123836 1707070 cri.go:89] found id: ""
	I1124 09:28:04.123849 1707070 logs.go:282] 0 containers: []
	W1124 09:28:04.123857 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:04.123862 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:04.123927 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:04.159485 1707070 cri.go:89] found id: ""
	I1124 09:28:04.159499 1707070 logs.go:282] 0 containers: []
	W1124 09:28:04.159506 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:04.159511 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:04.159572 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:04.187075 1707070 cri.go:89] found id: ""
	I1124 09:28:04.187089 1707070 logs.go:282] 0 containers: []
	W1124 09:28:04.187106 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:04.187112 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:04.187169 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:04.217664 1707070 cri.go:89] found id: ""
	I1124 09:28:04.217677 1707070 logs.go:282] 0 containers: []
	W1124 09:28:04.217696 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:04.217702 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:04.217769 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:04.244060 1707070 cri.go:89] found id: ""
	I1124 09:28:04.244075 1707070 logs.go:282] 0 containers: []
	W1124 09:28:04.244082 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:04.244087 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:04.244151 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:04.269297 1707070 cri.go:89] found id: ""
	I1124 09:28:04.269311 1707070 logs.go:282] 0 containers: []
	W1124 09:28:04.269318 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:04.269323 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:04.269382 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:04.296714 1707070 cri.go:89] found id: ""
	I1124 09:28:04.296730 1707070 logs.go:282] 0 containers: []
	W1124 09:28:04.296737 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:04.296745 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:04.296760 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:04.352538 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:04.352558 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:04.370334 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:04.370357 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:04.439006 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:04.429890   13996 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:04.430808   13996 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:04.432656   13996 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:04.433242   13996 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:04.435153   13996 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:04.429890   13996 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:04.430808   13996 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:04.432656   13996 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:04.433242   13996 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:04.435153   13996 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:04.439018 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:04.439027 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:04.503050 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:04.503072 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:07.038611 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:07.049789 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:07.049861 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:07.074863 1707070 cri.go:89] found id: ""
	I1124 09:28:07.074878 1707070 logs.go:282] 0 containers: []
	W1124 09:28:07.074885 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:07.074893 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:07.074950 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:07.099042 1707070 cri.go:89] found id: ""
	I1124 09:28:07.099057 1707070 logs.go:282] 0 containers: []
	W1124 09:28:07.099064 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:07.099070 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:07.099131 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:07.123608 1707070 cri.go:89] found id: ""
	I1124 09:28:07.123622 1707070 logs.go:282] 0 containers: []
	W1124 09:28:07.123630 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:07.123635 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:07.123706 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:07.151391 1707070 cri.go:89] found id: ""
	I1124 09:28:07.151405 1707070 logs.go:282] 0 containers: []
	W1124 09:28:07.151412 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:07.151418 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:07.151475 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:07.182488 1707070 cri.go:89] found id: ""
	I1124 09:28:07.182502 1707070 logs.go:282] 0 containers: []
	W1124 09:28:07.182510 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:07.182515 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:07.182581 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:07.207523 1707070 cri.go:89] found id: ""
	I1124 09:28:07.207537 1707070 logs.go:282] 0 containers: []
	W1124 09:28:07.207546 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:07.207552 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:07.207614 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:07.233412 1707070 cri.go:89] found id: ""
	I1124 09:28:07.233426 1707070 logs.go:282] 0 containers: []
	W1124 09:28:07.233433 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:07.233441 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:07.233451 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:07.288900 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:07.288922 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:07.306472 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:07.306493 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:07.368097 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:07.360574   14102 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:07.360956   14102 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:07.362483   14102 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:07.362820   14102 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:07.364269   14102 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:07.360574   14102 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:07.360956   14102 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:07.362483   14102 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:07.362820   14102 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:07.364269   14102 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:07.368108 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:07.368121 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:07.429983 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:07.430002 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:09.965289 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:09.976378 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:09.976448 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:10.015687 1707070 cri.go:89] found id: ""
	I1124 09:28:10.015705 1707070 logs.go:282] 0 containers: []
	W1124 09:28:10.015714 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:10.015721 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:10.015811 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:10.042717 1707070 cri.go:89] found id: ""
	I1124 09:28:10.042731 1707070 logs.go:282] 0 containers: []
	W1124 09:28:10.042738 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:10.042743 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:10.042805 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:10.069226 1707070 cri.go:89] found id: ""
	I1124 09:28:10.069240 1707070 logs.go:282] 0 containers: []
	W1124 09:28:10.069259 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:10.069265 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:10.069336 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:10.094576 1707070 cri.go:89] found id: ""
	I1124 09:28:10.094591 1707070 logs.go:282] 0 containers: []
	W1124 09:28:10.094599 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:10.094604 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:10.094683 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:10.120910 1707070 cri.go:89] found id: ""
	I1124 09:28:10.120925 1707070 logs.go:282] 0 containers: []
	W1124 09:28:10.120932 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:10.120938 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:10.121007 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:10.148454 1707070 cri.go:89] found id: ""
	I1124 09:28:10.148467 1707070 logs.go:282] 0 containers: []
	W1124 09:28:10.148476 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:10.148482 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:10.148545 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:10.180342 1707070 cri.go:89] found id: ""
	I1124 09:28:10.180356 1707070 logs.go:282] 0 containers: []
	W1124 09:28:10.180363 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:10.180377 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:10.180387 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:10.237982 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:10.238001 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:10.254875 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:10.254891 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:10.315902 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:10.307876   14207 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:10.308640   14207 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:10.310183   14207 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:10.310727   14207 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:10.312228   14207 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:10.307876   14207 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:10.308640   14207 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:10.310183   14207 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:10.310727   14207 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:10.312228   14207 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:10.315912 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:10.315922 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:10.381257 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:10.381276 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:12.913595 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:12.923674 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:12.923734 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:12.947804 1707070 cri.go:89] found id: ""
	I1124 09:28:12.947818 1707070 logs.go:282] 0 containers: []
	W1124 09:28:12.947826 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:12.947832 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:12.947892 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:12.971923 1707070 cri.go:89] found id: ""
	I1124 09:28:12.971937 1707070 logs.go:282] 0 containers: []
	W1124 09:28:12.971944 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:12.971956 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:12.972017 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:12.996325 1707070 cri.go:89] found id: ""
	I1124 09:28:12.996339 1707070 logs.go:282] 0 containers: []
	W1124 09:28:12.996357 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:12.996364 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:12.996436 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:13.022187 1707070 cri.go:89] found id: ""
	I1124 09:28:13.022203 1707070 logs.go:282] 0 containers: []
	W1124 09:28:13.022211 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:13.022224 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:13.022296 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:13.048161 1707070 cri.go:89] found id: ""
	I1124 09:28:13.048184 1707070 logs.go:282] 0 containers: []
	W1124 09:28:13.048192 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:13.048198 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:13.048262 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:13.073539 1707070 cri.go:89] found id: ""
	I1124 09:28:13.073564 1707070 logs.go:282] 0 containers: []
	W1124 09:28:13.073571 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:13.073578 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:13.073655 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:13.098089 1707070 cri.go:89] found id: ""
	I1124 09:28:13.098106 1707070 logs.go:282] 0 containers: []
	W1124 09:28:13.098114 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:13.098122 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:13.098132 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:13.140239 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:13.140255 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:13.197847 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:13.197865 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:13.217667 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:13.217686 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:13.281312 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:13.272865   14321 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:13.273748   14321 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:13.275370   14321 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:13.275717   14321 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:13.277237   14321 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:13.272865   14321 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:13.273748   14321 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:13.275370   14321 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:13.275717   14321 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:13.277237   14321 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:13.281322 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:13.281334 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:15.842684 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:15.853250 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:15.853311 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:15.878981 1707070 cri.go:89] found id: ""
	I1124 09:28:15.878995 1707070 logs.go:282] 0 containers: []
	W1124 09:28:15.879030 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:15.879036 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:15.879099 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:15.904674 1707070 cri.go:89] found id: ""
	I1124 09:28:15.904687 1707070 logs.go:282] 0 containers: []
	W1124 09:28:15.904695 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:15.904700 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:15.904757 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:15.929766 1707070 cri.go:89] found id: ""
	I1124 09:28:15.929780 1707070 logs.go:282] 0 containers: []
	W1124 09:28:15.929787 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:15.929793 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:15.929851 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:15.955453 1707070 cri.go:89] found id: ""
	I1124 09:28:15.955468 1707070 logs.go:282] 0 containers: []
	W1124 09:28:15.955475 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:15.955485 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:15.955543 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:15.983839 1707070 cri.go:89] found id: ""
	I1124 09:28:15.983854 1707070 logs.go:282] 0 containers: []
	W1124 09:28:15.983861 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:15.983866 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:15.983924 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:16.014730 1707070 cri.go:89] found id: ""
	I1124 09:28:16.014744 1707070 logs.go:282] 0 containers: []
	W1124 09:28:16.014752 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:16.014757 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:16.014820 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:16.046753 1707070 cri.go:89] found id: ""
	I1124 09:28:16.046767 1707070 logs.go:282] 0 containers: []
	W1124 09:28:16.046775 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:16.046783 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:16.046794 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:16.064199 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:16.064217 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:16.139691 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:16.122247   14407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:16.122923   14407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:16.124768   14407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:16.125231   14407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:16.126838   14407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:16.122247   14407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:16.122923   14407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:16.124768   14407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:16.125231   14407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:16.126838   14407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:16.139701 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:16.139711 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:16.206802 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:16.206822 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:16.234674 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:16.234690 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:18.790282 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:18.801848 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:18.801912 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:18.827821 1707070 cri.go:89] found id: ""
	I1124 09:28:18.827836 1707070 logs.go:282] 0 containers: []
	W1124 09:28:18.827843 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:18.827849 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:18.827905 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:18.852169 1707070 cri.go:89] found id: ""
	I1124 09:28:18.852184 1707070 logs.go:282] 0 containers: []
	W1124 09:28:18.852191 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:18.852196 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:18.852253 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:18.878610 1707070 cri.go:89] found id: ""
	I1124 09:28:18.878625 1707070 logs.go:282] 0 containers: []
	W1124 09:28:18.878633 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:18.878638 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:18.878702 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:18.903384 1707070 cri.go:89] found id: ""
	I1124 09:28:18.903403 1707070 logs.go:282] 0 containers: []
	W1124 09:28:18.903410 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:18.903416 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:18.903476 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:18.928519 1707070 cri.go:89] found id: ""
	I1124 09:28:18.928534 1707070 logs.go:282] 0 containers: []
	W1124 09:28:18.928542 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:18.928547 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:18.928609 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:18.956808 1707070 cri.go:89] found id: ""
	I1124 09:28:18.956823 1707070 logs.go:282] 0 containers: []
	W1124 09:28:18.956830 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:18.956836 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:18.956893 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:18.985113 1707070 cri.go:89] found id: ""
	I1124 09:28:18.985127 1707070 logs.go:282] 0 containers: []
	W1124 09:28:18.985134 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:18.985142 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:18.985152 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:19.019130 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:19.019146 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:19.075193 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:19.075213 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:19.092291 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:19.092306 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:19.162819 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:19.154959   14526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:19.155361   14526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:19.156834   14526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:19.157156   14526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:19.158629   14526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:19.154959   14526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:19.155361   14526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:19.156834   14526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:19.157156   14526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:19.158629   14526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:19.162839 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:19.162850 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
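The iterations above all follow the same pattern: minikube probes for a running kube-apiserver (first via `pgrep`, then via `crictl ps --name=kube-apiserver`), and when nothing is found it gathers kubelet/dmesg/containerd logs and retries a few seconds later. As a self-contained sketch of that bounded retry pattern (no `sudo` or `crictl` needed here; `check_apiserver` is a hypothetical stand-in for the real probes, and the 3-attempt cap is illustrative, not minikube's actual limit):

```shell
#!/usr/bin/env bash
# Stand-in probe: in the log above this role is played by
#   sudo pgrep -xnf kube-apiserver.*minikube.*
# followed by
#   sudo crictl ps -a --quiet --name=kube-apiserver
check_apiserver() { [ "${APISERVER_UP:-0}" = "1" ]; }

attempts=0
status="down"
while [ "$attempts" -lt 3 ]; do
  if check_apiserver; then
    status="up"
    break
  fi
  attempts=$((attempts + 1))
  # In the real loop, each failed attempt triggers log gathering
  # (journalctl -u kubelet, dmesg, journalctl -u containerd, crictl ps -a)
  # before sleeping and retrying.
  echo "attempt $attempts: kube-apiserver not running yet"
done
echo "kube-apiserver is $status after $attempts attempt(s)"
```

With the probe never succeeding (as in this log, where every `crictl ps` returns an empty id list and `kubectl` gets connection refused on localhost:8441), the loop exhausts its attempts and reports the apiserver as down.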
	I1124 09:28:21.737895 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:21.748053 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:21.748120 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:21.773590 1707070 cri.go:89] found id: ""
	I1124 09:28:21.773604 1707070 logs.go:282] 0 containers: []
	W1124 09:28:21.773611 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:21.773618 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:21.773679 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:21.800809 1707070 cri.go:89] found id: ""
	I1124 09:28:21.800866 1707070 logs.go:282] 0 containers: []
	W1124 09:28:21.800874 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:21.800880 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:21.800938 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:21.826581 1707070 cri.go:89] found id: ""
	I1124 09:28:21.826594 1707070 logs.go:282] 0 containers: []
	W1124 09:28:21.826602 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:21.826607 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:21.826668 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:21.856267 1707070 cri.go:89] found id: ""
	I1124 09:28:21.856282 1707070 logs.go:282] 0 containers: []
	W1124 09:28:21.856289 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:21.856295 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:21.856354 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:21.885138 1707070 cri.go:89] found id: ""
	I1124 09:28:21.885152 1707070 logs.go:282] 0 containers: []
	W1124 09:28:21.885160 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:21.885165 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:21.885224 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:21.909643 1707070 cri.go:89] found id: ""
	I1124 09:28:21.909657 1707070 logs.go:282] 0 containers: []
	W1124 09:28:21.909665 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:21.909671 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:21.909727 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:21.936792 1707070 cri.go:89] found id: ""
	I1124 09:28:21.936806 1707070 logs.go:282] 0 containers: []
	W1124 09:28:21.936813 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:21.936821 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:21.936831 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:21.993870 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:21.993890 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:22.011453 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:22.011474 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:22.078376 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:22.069998   14620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:22.070791   14620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:22.072423   14620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:22.073020   14620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:22.074616   14620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:22.069998   14620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:22.070791   14620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:22.072423   14620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:22.073020   14620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:22.074616   14620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:22.078387 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:22.078398 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:22.140934 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:22.140953 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:24.669313 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:24.679257 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:24.679328 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:24.707632 1707070 cri.go:89] found id: ""
	I1124 09:28:24.707647 1707070 logs.go:282] 0 containers: []
	W1124 09:28:24.707654 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:24.707660 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:24.707720 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:24.733688 1707070 cri.go:89] found id: ""
	I1124 09:28:24.733702 1707070 logs.go:282] 0 containers: []
	W1124 09:28:24.733710 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:24.733715 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:24.733773 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:24.759056 1707070 cri.go:89] found id: ""
	I1124 09:28:24.759071 1707070 logs.go:282] 0 containers: []
	W1124 09:28:24.759078 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:24.759084 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:24.759143 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:24.789918 1707070 cri.go:89] found id: ""
	I1124 09:28:24.789931 1707070 logs.go:282] 0 containers: []
	W1124 09:28:24.789938 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:24.789944 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:24.790003 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:24.814684 1707070 cri.go:89] found id: ""
	I1124 09:28:24.814698 1707070 logs.go:282] 0 containers: []
	W1124 09:28:24.814709 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:24.814714 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:24.814773 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:24.839467 1707070 cri.go:89] found id: ""
	I1124 09:28:24.839489 1707070 logs.go:282] 0 containers: []
	W1124 09:28:24.839497 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:24.839503 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:24.839568 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:24.863902 1707070 cri.go:89] found id: ""
	I1124 09:28:24.863917 1707070 logs.go:282] 0 containers: []
	W1124 09:28:24.863925 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:24.863933 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:24.863943 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:24.919300 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:24.919320 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:24.936150 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:24.936167 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:24.998414 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:24.990181   14723 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:24.990882   14723 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:24.992541   14723 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:24.993206   14723 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:24.994900   14723 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:24.990181   14723 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:24.990882   14723 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:24.992541   14723 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:24.993206   14723 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:24.994900   14723 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:24.998425 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:24.998435 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:25.062735 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:25.062756 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:27.591381 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:27.601598 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:27.601658 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:27.626062 1707070 cri.go:89] found id: ""
	I1124 09:28:27.626076 1707070 logs.go:282] 0 containers: []
	W1124 09:28:27.626084 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:27.626090 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:27.626152 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:27.654571 1707070 cri.go:89] found id: ""
	I1124 09:28:27.654591 1707070 logs.go:282] 0 containers: []
	W1124 09:28:27.654599 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:27.654604 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:27.654664 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:27.679294 1707070 cri.go:89] found id: ""
	I1124 09:28:27.679308 1707070 logs.go:282] 0 containers: []
	W1124 09:28:27.679315 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:27.679320 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:27.679377 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:27.702575 1707070 cri.go:89] found id: ""
	I1124 09:28:27.702588 1707070 logs.go:282] 0 containers: []
	W1124 09:28:27.702595 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:27.702601 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:27.702657 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:27.728251 1707070 cri.go:89] found id: ""
	I1124 09:28:27.728266 1707070 logs.go:282] 0 containers: []
	W1124 09:28:27.728273 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:27.728279 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:27.728339 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:27.752789 1707070 cri.go:89] found id: ""
	I1124 09:28:27.752802 1707070 logs.go:282] 0 containers: []
	W1124 09:28:27.752809 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:27.752815 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:27.752874 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:27.776833 1707070 cri.go:89] found id: ""
	I1124 09:28:27.776847 1707070 logs.go:282] 0 containers: []
	W1124 09:28:27.776854 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:27.776862 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:27.776871 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:27.837612 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:27.837637 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:27.866873 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:27.866890 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:27.925473 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:27.925492 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:27.942415 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:27.942432 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:28.014797 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:27.999267   14840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:28.000058   14840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:28.002028   14840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:28.002995   14840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:28.005197   14840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:27.999267   14840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:28.000058   14840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:28.002028   14840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:28.002995   14840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:28.005197   14840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:30.515707 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:30.526026 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:30.526102 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:30.550904 1707070 cri.go:89] found id: ""
	I1124 09:28:30.550918 1707070 logs.go:282] 0 containers: []
	W1124 09:28:30.550925 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:30.550931 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:30.550996 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:30.580837 1707070 cri.go:89] found id: ""
	I1124 09:28:30.580851 1707070 logs.go:282] 0 containers: []
	W1124 09:28:30.580859 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:30.580864 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:30.580920 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:30.605291 1707070 cri.go:89] found id: ""
	I1124 09:28:30.605305 1707070 logs.go:282] 0 containers: []
	W1124 09:28:30.605312 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:30.605318 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:30.605376 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:30.630158 1707070 cri.go:89] found id: ""
	I1124 09:28:30.630172 1707070 logs.go:282] 0 containers: []
	W1124 09:28:30.630181 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:30.630187 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:30.630254 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:30.653754 1707070 cri.go:89] found id: ""
	I1124 09:28:30.653772 1707070 logs.go:282] 0 containers: []
	W1124 09:28:30.653785 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:30.653790 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:30.653868 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:30.679137 1707070 cri.go:89] found id: ""
	I1124 09:28:30.679150 1707070 logs.go:282] 0 containers: []
	W1124 09:28:30.679157 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:30.679163 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:30.679221 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:30.703850 1707070 cri.go:89] found id: ""
	I1124 09:28:30.703864 1707070 logs.go:282] 0 containers: []
	W1124 09:28:30.703871 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:30.703879 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:30.703888 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:30.772547 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:30.764218   14925 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:30.764926   14925 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:30.766593   14925 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:30.767134   14925 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:30.768991   14925 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:30.764218   14925 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:30.764926   14925 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:30.766593   14925 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:30.767134   14925 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:30.768991   14925 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:30.772557 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:30.772568 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:30.834024 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:30.834043 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:30.862031 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:30.862046 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:30.920292 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:30.920311 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:33.438606 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:33.448762 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:33.448822 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:33.481032 1707070 cri.go:89] found id: ""
	I1124 09:28:33.481046 1707070 logs.go:282] 0 containers: []
	W1124 09:28:33.481053 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:33.481060 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:33.481117 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:33.504561 1707070 cri.go:89] found id: ""
	I1124 09:28:33.504576 1707070 logs.go:282] 0 containers: []
	W1124 09:28:33.504583 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:33.504589 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:33.504654 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:33.528885 1707070 cri.go:89] found id: ""
	I1124 09:28:33.528899 1707070 logs.go:282] 0 containers: []
	W1124 09:28:33.528906 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:33.528915 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:33.528972 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:33.553244 1707070 cri.go:89] found id: ""
	I1124 09:28:33.553258 1707070 logs.go:282] 0 containers: []
	W1124 09:28:33.553271 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:33.553277 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:33.553334 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:33.578519 1707070 cri.go:89] found id: ""
	I1124 09:28:33.578533 1707070 logs.go:282] 0 containers: []
	W1124 09:28:33.578541 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:33.578546 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:33.578607 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:33.602708 1707070 cri.go:89] found id: ""
	I1124 09:28:33.602721 1707070 logs.go:282] 0 containers: []
	W1124 09:28:33.602729 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:33.602734 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:33.602791 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:33.626894 1707070 cri.go:89] found id: ""
	I1124 09:28:33.626908 1707070 logs.go:282] 0 containers: []
	W1124 09:28:33.626916 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:33.626923 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:33.626934 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:33.684867 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:33.684887 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:33.701817 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:33.701834 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:33.775161 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:33.766757   15037 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:33.767480   15037 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:33.769022   15037 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:33.769484   15037 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:33.770951   15037 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:33.766757   15037 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:33.767480   15037 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:33.769022   15037 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:33.769484   15037 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:33.770951   15037 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:33.775172 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:33.775185 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:33.837667 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:33.837688 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:36.365266 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:36.376558 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:36.376622 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:36.412692 1707070 cri.go:89] found id: ""
	I1124 09:28:36.412706 1707070 logs.go:282] 0 containers: []
	W1124 09:28:36.412714 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:36.412719 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:36.412777 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:36.448943 1707070 cri.go:89] found id: ""
	I1124 09:28:36.448957 1707070 logs.go:282] 0 containers: []
	W1124 09:28:36.448964 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:36.448970 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:36.449031 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:36.474906 1707070 cri.go:89] found id: ""
	I1124 09:28:36.474920 1707070 logs.go:282] 0 containers: []
	W1124 09:28:36.474928 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:36.474934 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:36.474990 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:36.503770 1707070 cri.go:89] found id: ""
	I1124 09:28:36.503784 1707070 logs.go:282] 0 containers: []
	W1124 09:28:36.503792 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:36.503797 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:36.503863 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:36.532858 1707070 cri.go:89] found id: ""
	I1124 09:28:36.532872 1707070 logs.go:282] 0 containers: []
	W1124 09:28:36.532880 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:36.532885 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:36.532944 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:36.557874 1707070 cri.go:89] found id: ""
	I1124 09:28:36.557889 1707070 logs.go:282] 0 containers: []
	W1124 09:28:36.557896 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:36.557902 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:36.557959 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:36.582175 1707070 cri.go:89] found id: ""
	I1124 09:28:36.582189 1707070 logs.go:282] 0 containers: []
	W1124 09:28:36.582204 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:36.582212 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:36.582230 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:36.645586 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:36.637487   15138 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:36.638140   15138 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:36.639873   15138 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:36.640429   15138 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:36.641968   15138 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:36.637487   15138 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:36.638140   15138 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:36.639873   15138 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:36.640429   15138 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:36.641968   15138 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:36.645596 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:36.645607 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:36.708211 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:36.708231 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:36.740877 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:36.740894 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:36.798376 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:36.798396 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:39.316746 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:39.327050 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:39.327111 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:39.351416 1707070 cri.go:89] found id: ""
	I1124 09:28:39.351430 1707070 logs.go:282] 0 containers: []
	W1124 09:28:39.351438 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:39.351444 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:39.351500 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:39.375341 1707070 cri.go:89] found id: ""
	I1124 09:28:39.375355 1707070 logs.go:282] 0 containers: []
	W1124 09:28:39.375362 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:39.375367 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:39.375425 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:39.402220 1707070 cri.go:89] found id: ""
	I1124 09:28:39.402235 1707070 logs.go:282] 0 containers: []
	W1124 09:28:39.402241 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:39.402247 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:39.402306 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:39.434081 1707070 cri.go:89] found id: ""
	I1124 09:28:39.434094 1707070 logs.go:282] 0 containers: []
	W1124 09:28:39.434101 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:39.434107 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:39.434167 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:39.467514 1707070 cri.go:89] found id: ""
	I1124 09:28:39.467528 1707070 logs.go:282] 0 containers: []
	W1124 09:28:39.467535 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:39.467540 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:39.467597 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:39.500947 1707070 cri.go:89] found id: ""
	I1124 09:28:39.500961 1707070 logs.go:282] 0 containers: []
	W1124 09:28:39.500968 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:39.500974 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:39.501034 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:39.526637 1707070 cri.go:89] found id: ""
	I1124 09:28:39.526651 1707070 logs.go:282] 0 containers: []
	W1124 09:28:39.526658 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:39.526666 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:39.526676 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:39.582247 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:39.582268 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:39.599751 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:39.599767 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:39.668271 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:39.660949   15251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:39.661446   15251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:39.663149   15251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:39.663643   15251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:39.664706   15251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:39.660949   15251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:39.661446   15251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:39.663149   15251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:39.663643   15251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:39.664706   15251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:39.668281 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:39.668294 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:39.730931 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:39.730951 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:42.260305 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:42.272405 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:42.272489 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:42.300817 1707070 cri.go:89] found id: ""
	I1124 09:28:42.300842 1707070 logs.go:282] 0 containers: []
	W1124 09:28:42.300850 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:42.300856 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:42.300921 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:42.327350 1707070 cri.go:89] found id: ""
	I1124 09:28:42.327368 1707070 logs.go:282] 0 containers: []
	W1124 09:28:42.327377 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:42.327382 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:42.327441 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:42.352768 1707070 cri.go:89] found id: ""
	I1124 09:28:42.352781 1707070 logs.go:282] 0 containers: []
	W1124 09:28:42.352788 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:42.352794 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:42.352858 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:42.384996 1707070 cri.go:89] found id: ""
	I1124 09:28:42.385016 1707070 logs.go:282] 0 containers: []
	W1124 09:28:42.385024 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:42.385035 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:42.385109 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:42.433916 1707070 cri.go:89] found id: ""
	I1124 09:28:42.433942 1707070 logs.go:282] 0 containers: []
	W1124 09:28:42.433963 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:42.433974 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:42.434041 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:42.469962 1707070 cri.go:89] found id: ""
	I1124 09:28:42.469976 1707070 logs.go:282] 0 containers: []
	W1124 09:28:42.469983 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:42.469989 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:42.470045 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:42.494905 1707070 cri.go:89] found id: ""
	I1124 09:28:42.494919 1707070 logs.go:282] 0 containers: []
	W1124 09:28:42.494926 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:42.494934 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:42.494944 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:42.551276 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:42.551295 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:42.568521 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:42.568538 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:42.631652 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:42.623578   15356 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:42.624203   15356 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:42.625718   15356 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:42.626134   15356 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:42.627653   15356 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:42.623578   15356 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:42.624203   15356 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:42.625718   15356 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:42.626134   15356 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:42.627653   15356 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:42.631662 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:42.631689 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:42.697554 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:42.697573 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:45.228012 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:45.242540 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:45.242663 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:45.285651 1707070 cri.go:89] found id: ""
	I1124 09:28:45.285666 1707070 logs.go:282] 0 containers: []
	W1124 09:28:45.285673 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:45.285679 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:45.285747 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:45.315729 1707070 cri.go:89] found id: ""
	I1124 09:28:45.315744 1707070 logs.go:282] 0 containers: []
	W1124 09:28:45.315759 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:45.315766 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:45.315838 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:45.342027 1707070 cri.go:89] found id: ""
	I1124 09:28:45.342041 1707070 logs.go:282] 0 containers: []
	W1124 09:28:45.342048 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:45.342053 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:45.342112 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:45.368019 1707070 cri.go:89] found id: ""
	I1124 09:28:45.368033 1707070 logs.go:282] 0 containers: []
	W1124 09:28:45.368040 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:45.368046 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:45.368102 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:45.406091 1707070 cri.go:89] found id: ""
	I1124 09:28:45.406104 1707070 logs.go:282] 0 containers: []
	W1124 09:28:45.406112 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:45.406119 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:45.406176 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:45.432356 1707070 cri.go:89] found id: ""
	I1124 09:28:45.432369 1707070 logs.go:282] 0 containers: []
	W1124 09:28:45.432377 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:45.432382 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:45.432449 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:45.465291 1707070 cri.go:89] found id: ""
	I1124 09:28:45.465315 1707070 logs.go:282] 0 containers: []
	W1124 09:28:45.465324 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:45.465332 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:45.465345 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:45.527756 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:45.527784 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:45.544616 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:45.544642 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:45.606842 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:45.598345   15463 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:45.599427   15463 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:45.600949   15463 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:45.601550   15463 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:45.603105   15463 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:45.598345   15463 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:45.599427   15463 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:45.600949   15463 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:45.601550   15463 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:45.603105   15463 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:45.606853 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:45.606866 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:45.669056 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:45.669077 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:48.198708 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:48.210384 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:48.210449 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:48.235268 1707070 cri.go:89] found id: ""
	I1124 09:28:48.235282 1707070 logs.go:282] 0 containers: []
	W1124 09:28:48.235289 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:48.235295 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:48.235357 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:48.261413 1707070 cri.go:89] found id: ""
	I1124 09:28:48.261427 1707070 logs.go:282] 0 containers: []
	W1124 09:28:48.261434 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:48.261439 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:48.261496 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:48.291100 1707070 cri.go:89] found id: ""
	I1124 09:28:48.291114 1707070 logs.go:282] 0 containers: []
	W1124 09:28:48.291122 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:48.291127 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:48.291186 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:48.326388 1707070 cri.go:89] found id: ""
	I1124 09:28:48.326412 1707070 logs.go:282] 0 containers: []
	W1124 09:28:48.326420 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:48.326426 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:48.326499 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:48.356212 1707070 cri.go:89] found id: ""
	I1124 09:28:48.356227 1707070 logs.go:282] 0 containers: []
	W1124 09:28:48.356234 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:48.356240 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:48.356299 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:48.384677 1707070 cri.go:89] found id: ""
	I1124 09:28:48.384690 1707070 logs.go:282] 0 containers: []
	W1124 09:28:48.384697 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:48.384703 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:48.384759 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:48.422001 1707070 cri.go:89] found id: ""
	I1124 09:28:48.422015 1707070 logs.go:282] 0 containers: []
	W1124 09:28:48.422022 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:48.422030 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:48.422040 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:48.492980 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:48.493001 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:48.522367 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:48.522383 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:48.577847 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:48.577866 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:48.594803 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:48.594821 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:48.662402 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:48.654176   15580 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:48.655485   15580 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:48.656131   15580 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:48.657084   15580 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:48.658755   15580 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:48.654176   15580 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:48.655485   15580 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:48.656131   15580 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:48.657084   15580 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:48.658755   15580 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:51.162680 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:51.173802 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:51.173865 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:51.200124 1707070 cri.go:89] found id: ""
	I1124 09:28:51.200146 1707070 logs.go:282] 0 containers: []
	W1124 09:28:51.200155 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:51.200161 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:51.200220 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:51.225309 1707070 cri.go:89] found id: ""
	I1124 09:28:51.225323 1707070 logs.go:282] 0 containers: []
	W1124 09:28:51.225330 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:51.225335 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:51.225392 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:51.249971 1707070 cri.go:89] found id: ""
	I1124 09:28:51.249985 1707070 logs.go:282] 0 containers: []
	W1124 09:28:51.249992 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:51.249997 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:51.250053 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:51.275848 1707070 cri.go:89] found id: ""
	I1124 09:28:51.275861 1707070 logs.go:282] 0 containers: []
	W1124 09:28:51.275868 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:51.275874 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:51.275929 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:51.304356 1707070 cri.go:89] found id: ""
	I1124 09:28:51.304370 1707070 logs.go:282] 0 containers: []
	W1124 09:28:51.304386 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:51.304392 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:51.304450 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:51.329000 1707070 cri.go:89] found id: ""
	I1124 09:28:51.329015 1707070 logs.go:282] 0 containers: []
	W1124 09:28:51.329021 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:51.329027 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:51.329099 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:51.357783 1707070 cri.go:89] found id: ""
	I1124 09:28:51.357796 1707070 logs.go:282] 0 containers: []
	W1124 09:28:51.357804 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:51.357811 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:51.357820 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:51.426561 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:51.426582 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:51.456185 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:51.456202 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:51.512504 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:51.512525 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:51.530860 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:51.530877 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:51.596556 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:51.586703   15685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:51.587508   15685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:51.589233   15685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:51.589675   15685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:51.591800   15685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:51.586703   15685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:51.587508   15685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:51.589233   15685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:51.589675   15685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:51.591800   15685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:54.097448 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:54.107646 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:54.107710 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:54.131850 1707070 cri.go:89] found id: ""
	I1124 09:28:54.131869 1707070 logs.go:282] 0 containers: []
	W1124 09:28:54.131877 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:54.131883 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:54.131950 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:54.157778 1707070 cri.go:89] found id: ""
	I1124 09:28:54.157793 1707070 logs.go:282] 0 containers: []
	W1124 09:28:54.157800 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:54.157806 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:54.157871 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:54.183638 1707070 cri.go:89] found id: ""
	I1124 09:28:54.183661 1707070 logs.go:282] 0 containers: []
	W1124 09:28:54.183668 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:54.183676 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:54.183745 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:54.208654 1707070 cri.go:89] found id: ""
	I1124 09:28:54.208668 1707070 logs.go:282] 0 containers: []
	W1124 09:28:54.208675 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:54.208680 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:54.208741 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:54.237302 1707070 cri.go:89] found id: ""
	I1124 09:28:54.237317 1707070 logs.go:282] 0 containers: []
	W1124 09:28:54.237325 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:54.237331 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:54.237390 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:54.261089 1707070 cri.go:89] found id: ""
	I1124 09:28:54.261111 1707070 logs.go:282] 0 containers: []
	W1124 09:28:54.261119 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:54.261124 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:54.261195 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:54.289315 1707070 cri.go:89] found id: ""
	I1124 09:28:54.289337 1707070 logs.go:282] 0 containers: []
	W1124 09:28:54.289345 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:54.289353 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:54.289363 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:54.350840 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:54.350861 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:54.391880 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:54.391897 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:54.457044 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:54.457066 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:54.475507 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:54.475525 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:54.538358 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:54.529952   15794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:54.530805   15794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:54.531583   15794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:54.533115   15794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:54.533777   15794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:54.529952   15794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:54.530805   15794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:54.531583   15794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:54.533115   15794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:54.533777   15794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:57.040068 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:57.050642 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:57.050707 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:57.075811 1707070 cri.go:89] found id: ""
	I1124 09:28:57.075824 1707070 logs.go:282] 0 containers: []
	W1124 09:28:57.075832 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:57.075837 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:57.075899 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:57.106029 1707070 cri.go:89] found id: ""
	I1124 09:28:57.106044 1707070 logs.go:282] 0 containers: []
	W1124 09:28:57.106052 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:57.106058 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:57.106114 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:57.132742 1707070 cri.go:89] found id: ""
	I1124 09:28:57.132756 1707070 logs.go:282] 0 containers: []
	W1124 09:28:57.132763 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:57.132768 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:57.132825 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:57.156809 1707070 cri.go:89] found id: ""
	I1124 09:28:57.156823 1707070 logs.go:282] 0 containers: []
	W1124 09:28:57.156830 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:57.156835 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:57.156898 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:57.182649 1707070 cri.go:89] found id: ""
	I1124 09:28:57.182663 1707070 logs.go:282] 0 containers: []
	W1124 09:28:57.182670 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:57.182676 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:57.182733 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:57.206184 1707070 cri.go:89] found id: ""
	I1124 09:28:57.206198 1707070 logs.go:282] 0 containers: []
	W1124 09:28:57.206205 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:57.206211 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:57.206275 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:57.230629 1707070 cri.go:89] found id: ""
	I1124 09:28:57.230643 1707070 logs.go:282] 0 containers: []
	W1124 09:28:57.230651 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:57.230660 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:57.230670 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:57.287168 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:57.287187 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:57.304021 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:57.304037 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:57.368613 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:57.361126   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:57.361623   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:57.363259   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:57.363659   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:57.365140   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:57.361126   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:57.361623   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:57.363259   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:57.363659   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:57.365140   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:57.368624 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:57.368635 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:57.439834 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:57.439854 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:59.971306 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:59.982006 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:59.982066 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:00.016934 1707070 cri.go:89] found id: ""
	I1124 09:29:00.016951 1707070 logs.go:282] 0 containers: []
	W1124 09:29:00.016966 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:00.016973 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:00.017049 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:00.103638 1707070 cri.go:89] found id: ""
	I1124 09:29:00.103654 1707070 logs.go:282] 0 containers: []
	W1124 09:29:00.103663 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:00.103669 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:00.103740 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:00.170246 1707070 cri.go:89] found id: ""
	I1124 09:29:00.170264 1707070 logs.go:282] 0 containers: []
	W1124 09:29:00.170273 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:00.170280 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:00.170350 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:00.236365 1707070 cri.go:89] found id: ""
	I1124 09:29:00.236382 1707070 logs.go:282] 0 containers: []
	W1124 09:29:00.236390 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:00.236397 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:00.236474 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:00.304007 1707070 cri.go:89] found id: ""
	I1124 09:29:00.304026 1707070 logs.go:282] 0 containers: []
	W1124 09:29:00.304036 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:00.304048 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:00.304139 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:00.347892 1707070 cri.go:89] found id: ""
	I1124 09:29:00.347907 1707070 logs.go:282] 0 containers: []
	W1124 09:29:00.347916 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:00.347924 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:00.348047 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:00.392276 1707070 cri.go:89] found id: ""
	I1124 09:29:00.392292 1707070 logs.go:282] 0 containers: []
	W1124 09:29:00.392304 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:00.392314 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:00.392328 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:00.445097 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:00.445118 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:00.507903 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:00.507923 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:00.532762 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:00.532787 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:00.603329 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:00.595058   16004 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:00.595595   16004 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:00.597748   16004 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:00.598425   16004 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:00.599635   16004 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:00.595058   16004 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:00.595595   16004 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:00.597748   16004 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:00.598425   16004 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:00.599635   16004 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:00.603341 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:00.603352 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:03.164630 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:03.174868 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:03.174928 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:03.198952 1707070 cri.go:89] found id: ""
	I1124 09:29:03.198966 1707070 logs.go:282] 0 containers: []
	W1124 09:29:03.198973 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:03.198979 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:03.199038 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:03.228049 1707070 cri.go:89] found id: ""
	I1124 09:29:03.228063 1707070 logs.go:282] 0 containers: []
	W1124 09:29:03.228070 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:03.228075 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:03.228133 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:03.253873 1707070 cri.go:89] found id: ""
	I1124 09:29:03.253888 1707070 logs.go:282] 0 containers: []
	W1124 09:29:03.253895 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:03.253901 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:03.253969 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:03.277874 1707070 cri.go:89] found id: ""
	I1124 09:29:03.277889 1707070 logs.go:282] 0 containers: []
	W1124 09:29:03.277903 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:03.277909 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:03.277966 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:03.306311 1707070 cri.go:89] found id: ""
	I1124 09:29:03.306333 1707070 logs.go:282] 0 containers: []
	W1124 09:29:03.306340 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:03.306345 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:03.306402 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:03.330412 1707070 cri.go:89] found id: ""
	I1124 09:29:03.330425 1707070 logs.go:282] 0 containers: []
	W1124 09:29:03.330432 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:03.330438 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:03.330572 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:03.359087 1707070 cri.go:89] found id: ""
	I1124 09:29:03.359101 1707070 logs.go:282] 0 containers: []
	W1124 09:29:03.359108 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:03.359116 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:03.359125 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:03.430996 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:03.431015 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:03.467444 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:03.467460 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:03.526316 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:03.526336 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:03.543233 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:03.543250 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:03.605146 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:03.596435   16110 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:03.597161   16110 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:03.598917   16110 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:03.599598   16110 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:03.601425   16110 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:03.596435   16110 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:03.597161   16110 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:03.598917   16110 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:03.599598   16110 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:03.601425   16110 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:06.105406 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:06.116034 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:06.116093 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:06.140111 1707070 cri.go:89] found id: ""
	I1124 09:29:06.140125 1707070 logs.go:282] 0 containers: []
	W1124 09:29:06.140132 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:06.140137 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:06.140195 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:06.164893 1707070 cri.go:89] found id: ""
	I1124 09:29:06.164907 1707070 logs.go:282] 0 containers: []
	W1124 09:29:06.164914 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:06.164920 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:06.164979 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:06.190122 1707070 cri.go:89] found id: ""
	I1124 09:29:06.190137 1707070 logs.go:282] 0 containers: []
	W1124 09:29:06.190144 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:06.190149 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:06.190206 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:06.215548 1707070 cri.go:89] found id: ""
	I1124 09:29:06.215562 1707070 logs.go:282] 0 containers: []
	W1124 09:29:06.215569 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:06.215575 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:06.215630 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:06.239566 1707070 cri.go:89] found id: ""
	I1124 09:29:06.239592 1707070 logs.go:282] 0 containers: []
	W1124 09:29:06.239600 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:06.239605 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:06.239662 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:06.266190 1707070 cri.go:89] found id: ""
	I1124 09:29:06.266223 1707070 logs.go:282] 0 containers: []
	W1124 09:29:06.266232 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:06.266237 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:06.266301 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:06.289910 1707070 cri.go:89] found id: ""
	I1124 09:29:06.289923 1707070 logs.go:282] 0 containers: []
	W1124 09:29:06.289930 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:06.289939 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:06.289955 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:06.353044 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:06.345412   16195 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:06.345855   16195 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:06.347499   16195 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:06.347885   16195 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:06.349511   16195 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:06.345412   16195 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:06.345855   16195 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:06.347499   16195 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:06.347885   16195 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:06.349511   16195 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:06.353054 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:06.353068 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:06.420094 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:06.420114 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:06.452708 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:06.452724 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:06.508689 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:06.508708 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:09.026433 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:09.036862 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:09.036926 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:09.061951 1707070 cri.go:89] found id: ""
	I1124 09:29:09.061965 1707070 logs.go:282] 0 containers: []
	W1124 09:29:09.061972 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:09.061977 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:09.062035 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:09.087954 1707070 cri.go:89] found id: ""
	I1124 09:29:09.087968 1707070 logs.go:282] 0 containers: []
	W1124 09:29:09.087976 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:09.087981 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:09.088044 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:09.112784 1707070 cri.go:89] found id: ""
	I1124 09:29:09.112798 1707070 logs.go:282] 0 containers: []
	W1124 09:29:09.112805 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:09.112810 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:09.112869 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:09.137324 1707070 cri.go:89] found id: ""
	I1124 09:29:09.137339 1707070 logs.go:282] 0 containers: []
	W1124 09:29:09.137347 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:09.137353 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:09.137413 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:09.162408 1707070 cri.go:89] found id: ""
	I1124 09:29:09.162422 1707070 logs.go:282] 0 containers: []
	W1124 09:29:09.162430 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:09.162435 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:09.162513 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:09.191279 1707070 cri.go:89] found id: ""
	I1124 09:29:09.191293 1707070 logs.go:282] 0 containers: []
	W1124 09:29:09.191300 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:09.191305 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:09.191361 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:09.214616 1707070 cri.go:89] found id: ""
	I1124 09:29:09.214630 1707070 logs.go:282] 0 containers: []
	W1124 09:29:09.214637 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:09.214645 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:09.214657 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:09.270146 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:09.270164 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:09.287320 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:09.287340 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:09.352488 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:09.344015   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:09.344642   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:09.346617   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:09.347280   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:09.348952   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:09.344015   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:09.344642   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:09.346617   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:09.347280   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:09.348952   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:09.352499 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:09.352510 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:09.418511 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:09.418532 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:11.954969 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:11.967024 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:11.967089 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:11.990717 1707070 cri.go:89] found id: ""
	I1124 09:29:11.990733 1707070 logs.go:282] 0 containers: []
	W1124 09:29:11.990741 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:11.990746 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:11.990809 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:12.020399 1707070 cri.go:89] found id: ""
	I1124 09:29:12.020413 1707070 logs.go:282] 0 containers: []
	W1124 09:29:12.020421 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:12.020427 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:12.020495 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:12.047081 1707070 cri.go:89] found id: ""
	I1124 09:29:12.047105 1707070 logs.go:282] 0 containers: []
	W1124 09:29:12.047114 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:12.047120 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:12.047185 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:12.072046 1707070 cri.go:89] found id: ""
	I1124 09:29:12.072060 1707070 logs.go:282] 0 containers: []
	W1124 09:29:12.072068 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:12.072074 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:12.072131 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:12.103533 1707070 cri.go:89] found id: ""
	I1124 09:29:12.103547 1707070 logs.go:282] 0 containers: []
	W1124 09:29:12.103554 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:12.103559 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:12.103619 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:12.131885 1707070 cri.go:89] found id: ""
	I1124 09:29:12.131900 1707070 logs.go:282] 0 containers: []
	W1124 09:29:12.131908 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:12.131914 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:12.131977 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:12.156166 1707070 cri.go:89] found id: ""
	I1124 09:29:12.156180 1707070 logs.go:282] 0 containers: []
	W1124 09:29:12.156187 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:12.156195 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:12.156206 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:12.184115 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:12.184131 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:12.239534 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:12.239553 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:12.256920 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:12.256937 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:12.322513 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:12.315053   16419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:12.315552   16419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:12.317173   16419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:12.317659   16419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:12.319113   16419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:12.315053   16419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:12.315552   16419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:12.317173   16419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:12.317659   16419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:12.319113   16419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:12.322536 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:12.322546 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:14.891198 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:14.901386 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:14.901446 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:14.926318 1707070 cri.go:89] found id: ""
	I1124 09:29:14.926340 1707070 logs.go:282] 0 containers: []
	W1124 09:29:14.926347 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:14.926353 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:14.926413 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:14.955083 1707070 cri.go:89] found id: ""
	I1124 09:29:14.955097 1707070 logs.go:282] 0 containers: []
	W1124 09:29:14.955104 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:14.955110 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:14.955167 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:14.979745 1707070 cri.go:89] found id: ""
	I1124 09:29:14.979758 1707070 logs.go:282] 0 containers: []
	W1124 09:29:14.979766 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:14.979771 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:14.979829 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:15.004845 1707070 cri.go:89] found id: ""
	I1124 09:29:15.004861 1707070 logs.go:282] 0 containers: []
	W1124 09:29:15.004869 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:15.004875 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:15.004952 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:15.044211 1707070 cri.go:89] found id: ""
	I1124 09:29:15.044225 1707070 logs.go:282] 0 containers: []
	W1124 09:29:15.044237 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:15.044243 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:15.044330 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:15.075656 1707070 cri.go:89] found id: ""
	I1124 09:29:15.075669 1707070 logs.go:282] 0 containers: []
	W1124 09:29:15.075677 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:15.075682 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:15.075740 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:15.101378 1707070 cri.go:89] found id: ""
	I1124 09:29:15.101392 1707070 logs.go:282] 0 containers: []
	W1124 09:29:15.101400 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:15.101408 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:15.101418 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:15.159297 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:15.159316 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:15.176523 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:15.176541 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:15.242899 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:15.234359   16513 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:15.235294   16513 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:15.237104   16513 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:15.237675   16513 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:15.239169   16513 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:15.234359   16513 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:15.235294   16513 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:15.237104   16513 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:15.237675   16513 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:15.239169   16513 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:15.242909 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:15.242919 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:15.304297 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:15.304319 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:17.833530 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:17.843418 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:17.843476 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:17.867779 1707070 cri.go:89] found id: ""
	I1124 09:29:17.867793 1707070 logs.go:282] 0 containers: []
	W1124 09:29:17.867806 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:17.867811 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:17.867866 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:17.891077 1707070 cri.go:89] found id: ""
	I1124 09:29:17.891090 1707070 logs.go:282] 0 containers: []
	W1124 09:29:17.891098 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:17.891103 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:17.891187 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:17.915275 1707070 cri.go:89] found id: ""
	I1124 09:29:17.915289 1707070 logs.go:282] 0 containers: []
	W1124 09:29:17.915296 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:17.915301 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:17.915357 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:17.943098 1707070 cri.go:89] found id: ""
	I1124 09:29:17.943111 1707070 logs.go:282] 0 containers: []
	W1124 09:29:17.943119 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:17.943124 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:17.943186 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:17.968417 1707070 cri.go:89] found id: ""
	I1124 09:29:17.968430 1707070 logs.go:282] 0 containers: []
	W1124 09:29:17.968437 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:17.968443 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:17.968501 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:17.993301 1707070 cri.go:89] found id: ""
	I1124 09:29:17.993315 1707070 logs.go:282] 0 containers: []
	W1124 09:29:17.993322 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:17.993328 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:17.993385 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:18.021715 1707070 cri.go:89] found id: ""
	I1124 09:29:18.021730 1707070 logs.go:282] 0 containers: []
	W1124 09:29:18.021738 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:18.021746 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:18.021756 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:18.085324 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:18.085345 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:18.118128 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:18.118159 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:18.182148 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:18.182171 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:18.199970 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:18.199990 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:18.266928 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:18.258137   16633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:18.258818   16633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:18.260418   16633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:18.261036   16633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:18.262678   16633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:18.258137   16633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:18.258818   16633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:18.260418   16633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:18.261036   16633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:18.262678   16633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:20.768145 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:20.780890 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:20.780956 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:20.807227 1707070 cri.go:89] found id: ""
	I1124 09:29:20.807241 1707070 logs.go:282] 0 containers: []
	W1124 09:29:20.807248 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:20.807253 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:20.807317 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:20.836452 1707070 cri.go:89] found id: ""
	I1124 09:29:20.836466 1707070 logs.go:282] 0 containers: []
	W1124 09:29:20.836473 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:20.836478 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:20.836535 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:20.861534 1707070 cri.go:89] found id: ""
	I1124 09:29:20.861549 1707070 logs.go:282] 0 containers: []
	W1124 09:29:20.861556 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:20.861561 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:20.861620 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:20.890181 1707070 cri.go:89] found id: ""
	I1124 09:29:20.890196 1707070 logs.go:282] 0 containers: []
	W1124 09:29:20.890203 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:20.890209 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:20.890278 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:20.919882 1707070 cri.go:89] found id: ""
	I1124 09:29:20.919897 1707070 logs.go:282] 0 containers: []
	W1124 09:29:20.919904 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:20.919910 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:20.919973 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:20.948347 1707070 cri.go:89] found id: ""
	I1124 09:29:20.948361 1707070 logs.go:282] 0 containers: []
	W1124 09:29:20.948368 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:20.948373 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:20.948428 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:20.972834 1707070 cri.go:89] found id: ""
	I1124 09:29:20.972847 1707070 logs.go:282] 0 containers: []
	W1124 09:29:20.972855 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:20.972862 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:20.972873 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:21.029330 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:21.029350 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:21.046983 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:21.047000 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:21.112004 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:21.104171   16728 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:21.104918   16728 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:21.106573   16728 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:21.107127   16728 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:21.108653   16728 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:21.104171   16728 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:21.104918   16728 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:21.106573   16728 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:21.107127   16728 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:21.108653   16728 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:21.112015 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:21.112025 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:21.174850 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:21.174870 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:23.702609 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:23.712856 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:23.712939 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:23.741964 1707070 cri.go:89] found id: ""
	I1124 09:29:23.741978 1707070 logs.go:282] 0 containers: []
	W1124 09:29:23.741985 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:23.741991 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:23.742067 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:23.766952 1707070 cri.go:89] found id: ""
	I1124 09:29:23.766966 1707070 logs.go:282] 0 containers: []
	W1124 09:29:23.766972 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:23.766978 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:23.767035 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:23.790992 1707070 cri.go:89] found id: ""
	I1124 09:29:23.791005 1707070 logs.go:282] 0 containers: []
	W1124 09:29:23.791013 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:23.791018 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:23.791073 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:23.819700 1707070 cri.go:89] found id: ""
	I1124 09:29:23.819713 1707070 logs.go:282] 0 containers: []
	W1124 09:29:23.819720 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:23.819726 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:23.819786 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:23.848657 1707070 cri.go:89] found id: ""
	I1124 09:29:23.848683 1707070 logs.go:282] 0 containers: []
	W1124 09:29:23.848690 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:23.848695 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:23.848754 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:23.873546 1707070 cri.go:89] found id: ""
	I1124 09:29:23.873571 1707070 logs.go:282] 0 containers: []
	W1124 09:29:23.873578 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:23.873584 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:23.873654 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:23.899519 1707070 cri.go:89] found id: ""
	I1124 09:29:23.899533 1707070 logs.go:282] 0 containers: []
	W1124 09:29:23.899547 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:23.899556 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:23.899568 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:23.954834 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:23.954854 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:23.971662 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:23.971680 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:24.041660 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:24.033560   16835 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:24.034352   16835 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:24.036062   16835 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:24.036417   16835 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:24.038032   16835 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:24.033560   16835 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:24.034352   16835 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:24.036062   16835 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:24.036417   16835 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:24.038032   16835 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:24.041670 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:24.041681 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:24.105146 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:24.105168 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:26.634760 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:26.646166 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:26.646251 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:26.679257 1707070 cri.go:89] found id: ""
	I1124 09:29:26.679271 1707070 logs.go:282] 0 containers: []
	W1124 09:29:26.679279 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:26.679284 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:26.679344 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:26.710754 1707070 cri.go:89] found id: ""
	I1124 09:29:26.710768 1707070 logs.go:282] 0 containers: []
	W1124 09:29:26.710775 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:26.710782 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:26.710840 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:26.735831 1707070 cri.go:89] found id: ""
	I1124 09:29:26.735845 1707070 logs.go:282] 0 containers: []
	W1124 09:29:26.735852 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:26.735857 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:26.735926 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:26.759918 1707070 cri.go:89] found id: ""
	I1124 09:29:26.759932 1707070 logs.go:282] 0 containers: []
	W1124 09:29:26.759939 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:26.759947 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:26.760002 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:26.783806 1707070 cri.go:89] found id: ""
	I1124 09:29:26.783825 1707070 logs.go:282] 0 containers: []
	W1124 09:29:26.783832 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:26.783838 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:26.783895 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:26.809230 1707070 cri.go:89] found id: ""
	I1124 09:29:26.809244 1707070 logs.go:282] 0 containers: []
	W1124 09:29:26.809252 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:26.809266 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:26.809331 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:26.836902 1707070 cri.go:89] found id: ""
	I1124 09:29:26.836916 1707070 logs.go:282] 0 containers: []
	W1124 09:29:26.836923 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:26.836931 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:26.836942 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:26.853955 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:26.853978 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:26.916186 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:26.907929   16937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:26.908672   16937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:26.910345   16937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:26.910937   16937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:26.912681   16937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:26.907929   16937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:26.908672   16937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:26.910345   16937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:26.910937   16937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:26.912681   16937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:26.916196 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:26.916218 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:26.980050 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:26.980072 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:27.010821 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:27.010838 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:29.573482 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:29.583518 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:29.583582 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:29.608188 1707070 cri.go:89] found id: ""
	I1124 09:29:29.608202 1707070 logs.go:282] 0 containers: []
	W1124 09:29:29.608209 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:29.608214 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:29.608270 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:29.641187 1707070 cri.go:89] found id: ""
	I1124 09:29:29.641201 1707070 logs.go:282] 0 containers: []
	W1124 09:29:29.641209 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:29.641214 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:29.641282 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:29.674249 1707070 cri.go:89] found id: ""
	I1124 09:29:29.674269 1707070 logs.go:282] 0 containers: []
	W1124 09:29:29.674276 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:29.674282 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:29.674339 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:29.700355 1707070 cri.go:89] found id: ""
	I1124 09:29:29.700370 1707070 logs.go:282] 0 containers: []
	W1124 09:29:29.700377 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:29.700382 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:29.700438 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:29.729232 1707070 cri.go:89] found id: ""
	I1124 09:29:29.729246 1707070 logs.go:282] 0 containers: []
	W1124 09:29:29.729253 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:29.729257 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:29.729313 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:29.756753 1707070 cri.go:89] found id: ""
	I1124 09:29:29.756766 1707070 logs.go:282] 0 containers: []
	W1124 09:29:29.756773 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:29.756788 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:29.756849 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:29.782318 1707070 cri.go:89] found id: ""
	I1124 09:29:29.782332 1707070 logs.go:282] 0 containers: []
	W1124 09:29:29.782339 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:29.782347 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:29.782358 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:29.837944 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:29.837963 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:29.855075 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:29.855094 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:29.916212 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:29.907972   17045 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:29.908745   17045 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:29.910447   17045 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:29.910972   17045 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:29.912670   17045 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:29.907972   17045 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:29.908745   17045 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:29.910447   17045 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:29.910972   17045 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:29.912670   17045 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:29.916221 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:29.916232 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:29.978681 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:29.978703 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:32.530833 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:32.541146 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:32.541251 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:32.566525 1707070 cri.go:89] found id: ""
	I1124 09:29:32.566540 1707070 logs.go:282] 0 containers: []
	W1124 09:29:32.566548 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:32.566554 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:32.566622 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:32.591741 1707070 cri.go:89] found id: ""
	I1124 09:29:32.591756 1707070 logs.go:282] 0 containers: []
	W1124 09:29:32.591763 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:32.591768 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:32.591826 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:32.617127 1707070 cri.go:89] found id: ""
	I1124 09:29:32.617141 1707070 logs.go:282] 0 containers: []
	W1124 09:29:32.617148 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:32.617153 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:32.617209 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:32.654493 1707070 cri.go:89] found id: ""
	I1124 09:29:32.654507 1707070 logs.go:282] 0 containers: []
	W1124 09:29:32.654515 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:32.654521 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:32.654580 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:32.685080 1707070 cri.go:89] found id: ""
	I1124 09:29:32.685094 1707070 logs.go:282] 0 containers: []
	W1124 09:29:32.685101 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:32.685106 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:32.685180 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:32.715751 1707070 cri.go:89] found id: ""
	I1124 09:29:32.715766 1707070 logs.go:282] 0 containers: []
	W1124 09:29:32.715782 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:32.715788 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:32.715850 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:32.742395 1707070 cri.go:89] found id: ""
	I1124 09:29:32.742409 1707070 logs.go:282] 0 containers: []
	W1124 09:29:32.742416 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:32.742424 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:32.742434 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:32.760261 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:32.760278 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:32.828736 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:32.819577   17149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:32.820328   17149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:32.822013   17149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:32.822622   17149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:32.824506   17149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:32.819577   17149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:32.820328   17149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:32.822013   17149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:32.822622   17149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:32.824506   17149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:32.828746 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:32.828759 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:32.896940 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:32.896965 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:32.928695 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:32.928711 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:35.485941 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:35.496873 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:35.496934 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:35.525748 1707070 cri.go:89] found id: ""
	I1124 09:29:35.525782 1707070 logs.go:282] 0 containers: []
	W1124 09:29:35.525791 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:35.525796 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:35.525866 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:35.553111 1707070 cri.go:89] found id: ""
	I1124 09:29:35.553126 1707070 logs.go:282] 0 containers: []
	W1124 09:29:35.553134 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:35.553142 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:35.553220 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:35.578594 1707070 cri.go:89] found id: ""
	I1124 09:29:35.578622 1707070 logs.go:282] 0 containers: []
	W1124 09:29:35.578629 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:35.578635 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:35.578706 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:35.607322 1707070 cri.go:89] found id: ""
	I1124 09:29:35.607336 1707070 logs.go:282] 0 containers: []
	W1124 09:29:35.607343 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:35.607348 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:35.607417 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:35.638865 1707070 cri.go:89] found id: ""
	I1124 09:29:35.638880 1707070 logs.go:282] 0 containers: []
	W1124 09:29:35.638887 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:35.638893 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:35.638960 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:35.672327 1707070 cri.go:89] found id: ""
	I1124 09:29:35.672352 1707070 logs.go:282] 0 containers: []
	W1124 09:29:35.672360 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:35.672365 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:35.672431 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:35.700255 1707070 cri.go:89] found id: ""
	I1124 09:29:35.700269 1707070 logs.go:282] 0 containers: []
	W1124 09:29:35.700277 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:35.700285 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:35.700297 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:35.758017 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:35.758037 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:35.775326 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:35.775344 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:35.842090 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:35.833688   17255 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:35.834400   17255 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:35.836148   17255 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:35.836802   17255 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:35.838521   17255 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:35.833688   17255 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:35.834400   17255 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:35.836148   17255 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:35.836802   17255 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:35.838521   17255 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:35.842100 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:35.842120 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:35.908742 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:35.908769 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:38.443689 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:38.453968 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:38.454035 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:38.477762 1707070 cri.go:89] found id: ""
	I1124 09:29:38.477776 1707070 logs.go:282] 0 containers: []
	W1124 09:29:38.477783 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:38.477789 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:38.477853 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:38.506120 1707070 cri.go:89] found id: ""
	I1124 09:29:38.506134 1707070 logs.go:282] 0 containers: []
	W1124 09:29:38.506141 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:38.506147 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:38.506203 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:38.530669 1707070 cri.go:89] found id: ""
	I1124 09:29:38.530691 1707070 logs.go:282] 0 containers: []
	W1124 09:29:38.530699 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:38.530705 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:38.530763 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:38.560535 1707070 cri.go:89] found id: ""
	I1124 09:29:38.560558 1707070 logs.go:282] 0 containers: []
	W1124 09:29:38.560565 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:38.560572 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:38.560631 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:38.586535 1707070 cri.go:89] found id: ""
	I1124 09:29:38.586549 1707070 logs.go:282] 0 containers: []
	W1124 09:29:38.586556 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:38.586561 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:38.586620 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:38.611101 1707070 cri.go:89] found id: ""
	I1124 09:29:38.611115 1707070 logs.go:282] 0 containers: []
	W1124 09:29:38.611122 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:38.611127 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:38.611186 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:38.643467 1707070 cri.go:89] found id: ""
	I1124 09:29:38.643482 1707070 logs.go:282] 0 containers: []
	W1124 09:29:38.643489 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:38.643497 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:38.643508 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:38.708197 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:38.708218 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:38.725978 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:38.725995 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:38.789806 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:38.781672   17359 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:38.782397   17359 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:38.783993   17359 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:38.784577   17359 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:38.786135   17359 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:38.781672   17359 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:38.782397   17359 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:38.783993   17359 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:38.784577   17359 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:38.786135   17359 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:38.789818 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:38.789828 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:38.853085 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:38.853106 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:41.387044 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:41.398117 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:41.398183 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:41.424537 1707070 cri.go:89] found id: ""
	I1124 09:29:41.424551 1707070 logs.go:282] 0 containers: []
	W1124 09:29:41.424558 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:41.424564 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:41.424626 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:41.454716 1707070 cri.go:89] found id: ""
	I1124 09:29:41.454730 1707070 logs.go:282] 0 containers: []
	W1124 09:29:41.454737 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:41.454742 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:41.454801 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:41.479954 1707070 cri.go:89] found id: ""
	I1124 09:29:41.479969 1707070 logs.go:282] 0 containers: []
	W1124 09:29:41.479976 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:41.479981 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:41.480041 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:41.505560 1707070 cri.go:89] found id: ""
	I1124 09:29:41.505575 1707070 logs.go:282] 0 containers: []
	W1124 09:29:41.505582 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:41.505593 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:41.505654 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:41.530996 1707070 cri.go:89] found id: ""
	I1124 09:29:41.531010 1707070 logs.go:282] 0 containers: []
	W1124 09:29:41.531018 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:41.531024 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:41.531090 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:41.557489 1707070 cri.go:89] found id: ""
	I1124 09:29:41.557502 1707070 logs.go:282] 0 containers: []
	W1124 09:29:41.557510 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:41.557516 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:41.557575 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:41.587178 1707070 cri.go:89] found id: ""
	I1124 09:29:41.587192 1707070 logs.go:282] 0 containers: []
	W1124 09:29:41.587199 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:41.587207 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:41.587217 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:41.644853 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:41.644873 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:41.664905 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:41.664924 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:41.731530 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:41.723947   17464 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:41.724430   17464 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:41.726128   17464 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:41.726440   17464 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:41.727892   17464 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1124 09:29:41.731540 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:41.731550 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:41.793965 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:41.793985 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:44.323959 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:44.334291 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:44.334352 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:44.364183 1707070 cri.go:89] found id: ""
	I1124 09:29:44.364199 1707070 logs.go:282] 0 containers: []
	W1124 09:29:44.364206 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:44.364212 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:44.364285 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:44.391116 1707070 cri.go:89] found id: ""
	I1124 09:29:44.391130 1707070 logs.go:282] 0 containers: []
	W1124 09:29:44.391137 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:44.391142 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:44.391199 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:44.416448 1707070 cri.go:89] found id: ""
	I1124 09:29:44.416462 1707070 logs.go:282] 0 containers: []
	W1124 09:29:44.416470 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:44.416476 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:44.416533 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:44.442027 1707070 cri.go:89] found id: ""
	I1124 09:29:44.442042 1707070 logs.go:282] 0 containers: []
	W1124 09:29:44.442059 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:44.442065 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:44.442124 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:44.467492 1707070 cri.go:89] found id: ""
	I1124 09:29:44.467516 1707070 logs.go:282] 0 containers: []
	W1124 09:29:44.467525 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:44.467531 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:44.467643 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:44.492900 1707070 cri.go:89] found id: ""
	I1124 09:29:44.492914 1707070 logs.go:282] 0 containers: []
	W1124 09:29:44.492921 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:44.492927 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:44.492986 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:44.518419 1707070 cri.go:89] found id: ""
	I1124 09:29:44.518434 1707070 logs.go:282] 0 containers: []
	W1124 09:29:44.518441 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:44.518449 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:44.518479 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:44.584407 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:44.584427 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:44.616287 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:44.616305 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:44.680013 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:44.680033 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:44.702644 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:44.702662 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:44.770803 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:44.761924   17579 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:44.762682   17579 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:44.764417   17579 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:44.765036   17579 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:44.766673   17579 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1124 09:29:47.271699 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:47.283580 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:47.283646 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:47.309341 1707070 cri.go:89] found id: ""
	I1124 09:29:47.309355 1707070 logs.go:282] 0 containers: []
	W1124 09:29:47.309368 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:47.309385 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:47.309443 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:47.335187 1707070 cri.go:89] found id: ""
	I1124 09:29:47.335202 1707070 logs.go:282] 0 containers: []
	W1124 09:29:47.335209 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:47.335214 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:47.335273 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:47.362876 1707070 cri.go:89] found id: ""
	I1124 09:29:47.362891 1707070 logs.go:282] 0 containers: []
	W1124 09:29:47.362898 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:47.362904 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:47.362964 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:47.388290 1707070 cri.go:89] found id: ""
	I1124 09:29:47.388304 1707070 logs.go:282] 0 containers: []
	W1124 09:29:47.388311 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:47.388317 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:47.388374 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:47.416544 1707070 cri.go:89] found id: ""
	I1124 09:29:47.416558 1707070 logs.go:282] 0 containers: []
	W1124 09:29:47.416565 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:47.416570 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:47.416629 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:47.441861 1707070 cri.go:89] found id: ""
	I1124 09:29:47.441875 1707070 logs.go:282] 0 containers: []
	W1124 09:29:47.441902 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:47.441909 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:47.441978 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:47.465857 1707070 cri.go:89] found id: ""
	I1124 09:29:47.465879 1707070 logs.go:282] 0 containers: []
	W1124 09:29:47.465886 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:47.465894 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:47.465905 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:47.523429 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:47.523450 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:47.540445 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:47.540462 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:47.607683 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:47.599524   17667 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:47.600165   17667 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:47.601865   17667 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:47.602402   17667 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:47.603965   17667 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1124 09:29:47.607694 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:47.607704 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:47.682000 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:47.682023 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:50.218599 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:50.229182 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:50.229254 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:50.254129 1707070 cri.go:89] found id: ""
	I1124 09:29:50.254143 1707070 logs.go:282] 0 containers: []
	W1124 09:29:50.254150 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:50.254155 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:50.254219 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:50.280233 1707070 cri.go:89] found id: ""
	I1124 09:29:50.280247 1707070 logs.go:282] 0 containers: []
	W1124 09:29:50.280254 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:50.280260 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:50.280317 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:50.304403 1707070 cri.go:89] found id: ""
	I1124 09:29:50.304417 1707070 logs.go:282] 0 containers: []
	W1124 09:29:50.304424 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:50.304430 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:50.304492 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:50.329881 1707070 cri.go:89] found id: ""
	I1124 09:29:50.329897 1707070 logs.go:282] 0 containers: []
	W1124 09:29:50.329904 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:50.329910 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:50.329987 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:50.358124 1707070 cri.go:89] found id: ""
	I1124 09:29:50.358139 1707070 logs.go:282] 0 containers: []
	W1124 09:29:50.358149 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:50.358158 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:50.358246 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:50.384151 1707070 cri.go:89] found id: ""
	I1124 09:29:50.384165 1707070 logs.go:282] 0 containers: []
	W1124 09:29:50.384178 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:50.384196 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:50.384254 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:50.408884 1707070 cri.go:89] found id: ""
	I1124 09:29:50.408899 1707070 logs.go:282] 0 containers: []
	W1124 09:29:50.408906 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:50.408914 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:50.408925 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:50.464122 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:50.464147 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:50.480720 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:50.480736 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:50.544337 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:50.536334   17772 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:50.536956   17772 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:50.538555   17772 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:50.539042   17772 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:50.540634   17772 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1124 09:29:50.544348 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:50.544361 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:50.606972 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:50.606993 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:53.143446 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:53.154359 1707070 kubeadm.go:602] duration metric: took 4m4.065975367s to restartPrimaryControlPlane
	W1124 09:29:53.154423 1707070 out.go:285] ! Unable to restart control-plane node(s), will reset cluster: <no value>
	I1124 09:29:53.154529 1707070 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm reset --cri-socket /run/containerd/containerd.sock --force"
	I1124 09:29:53.563147 1707070 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1124 09:29:53.576942 1707070 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1124 09:29:53.584698 1707070 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1124 09:29:53.584758 1707070 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1124 09:29:53.592605 1707070 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1124 09:29:53.592613 1707070 kubeadm.go:158] found existing configuration files:
	
	I1124 09:29:53.592678 1707070 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1124 09:29:53.600460 1707070 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1124 09:29:53.600517 1707070 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1124 09:29:53.607615 1707070 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1124 09:29:53.615236 1707070 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1124 09:29:53.615293 1707070 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1124 09:29:53.622532 1707070 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1124 09:29:53.630501 1707070 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1124 09:29:53.630562 1707070 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1124 09:29:53.638386 1707070 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1124 09:29:53.646257 1707070 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1124 09:29:53.646321 1707070 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1124 09:29:53.653836 1707070 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1124 09:29:53.692708 1707070 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-beta.0
	I1124 09:29:53.692756 1707070 kubeadm.go:319] [preflight] Running pre-flight checks
	I1124 09:29:53.765347 1707070 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1124 09:29:53.765413 1707070 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1124 09:29:53.765447 1707070 kubeadm.go:319] OS: Linux
	I1124 09:29:53.765490 1707070 kubeadm.go:319] CGROUPS_CPU: enabled
	I1124 09:29:53.765537 1707070 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1124 09:29:53.765589 1707070 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1124 09:29:53.765636 1707070 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1124 09:29:53.765682 1707070 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1124 09:29:53.765729 1707070 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1124 09:29:53.765772 1707070 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1124 09:29:53.765819 1707070 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1124 09:29:53.765864 1707070 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1124 09:29:53.828877 1707070 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1124 09:29:53.829001 1707070 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1124 09:29:53.829104 1707070 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1124 09:29:53.834791 1707070 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1124 09:29:53.838245 1707070 out.go:252]   - Generating certificates and keys ...
	I1124 09:29:53.838369 1707070 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1124 09:29:53.838434 1707070 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1124 09:29:53.838527 1707070 kubeadm.go:319] [certs] Using existing apiserver-kubelet-client certificate and key on disk
	I1124 09:29:53.838616 1707070 kubeadm.go:319] [certs] Using existing front-proxy-ca certificate authority
	I1124 09:29:53.838701 1707070 kubeadm.go:319] [certs] Using existing front-proxy-client certificate and key on disk
	I1124 09:29:53.838784 1707070 kubeadm.go:319] [certs] Using existing etcd/ca certificate authority
	I1124 09:29:53.838854 1707070 kubeadm.go:319] [certs] Using existing etcd/server certificate and key on disk
	I1124 09:29:53.838919 1707070 kubeadm.go:319] [certs] Using existing etcd/peer certificate and key on disk
	I1124 09:29:53.839002 1707070 kubeadm.go:319] [certs] Using existing etcd/healthcheck-client certificate and key on disk
	I1124 09:29:53.839386 1707070 kubeadm.go:319] [certs] Using existing apiserver-etcd-client certificate and key on disk
	I1124 09:29:53.839639 1707070 kubeadm.go:319] [certs] Using the existing "sa" key
	I1124 09:29:53.839706 1707070 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1124 09:29:54.545063 1707070 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1124 09:29:55.036514 1707070 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1124 09:29:55.148786 1707070 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1124 09:29:55.311399 1707070 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1124 09:29:55.656188 1707070 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1124 09:29:55.656996 1707070 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1124 09:29:55.659590 1707070 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1124 09:29:55.662658 1707070 out.go:252]   - Booting up control plane ...
	I1124 09:29:55.662786 1707070 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1124 09:29:55.662870 1707070 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1124 09:29:55.664747 1707070 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1124 09:29:55.686536 1707070 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1124 09:29:55.686657 1707070 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1124 09:29:55.694440 1707070 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1124 09:29:55.694885 1707070 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1124 09:29:55.694934 1707070 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1124 09:29:55.830944 1707070 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1124 09:29:55.831051 1707070 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1124 09:33:55.829210 1707070 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.000251849s
	I1124 09:33:55.829235 1707070 kubeadm.go:319] 
	I1124 09:33:55.829291 1707070 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1124 09:33:55.829323 1707070 kubeadm.go:319] 	- The kubelet is not running
	I1124 09:33:55.829428 1707070 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1124 09:33:55.829432 1707070 kubeadm.go:319] 
	I1124 09:33:55.829536 1707070 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1124 09:33:55.829573 1707070 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1124 09:33:55.829603 1707070 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1124 09:33:55.829606 1707070 kubeadm.go:319] 
	I1124 09:33:55.833661 1707070 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1124 09:33:55.834099 1707070 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1124 09:33:55.834220 1707070 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1124 09:33:55.834508 1707070 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	I1124 09:33:55.834517 1707070 kubeadm.go:319] 
	I1124 09:33:55.834670 1707070 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	W1124 09:33:55.834735 1707070 out.go:285] ! initialization failed, will try again: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000251849s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	I1124 09:33:55.834825 1707070 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm reset --cri-socket /run/containerd/containerd.sock --force"
	I1124 09:33:56.243415 1707070 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1124 09:33:56.256462 1707070 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1124 09:33:56.256517 1707070 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1124 09:33:56.264387 1707070 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1124 09:33:56.264397 1707070 kubeadm.go:158] found existing configuration files:
	
	I1124 09:33:56.264448 1707070 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1124 09:33:56.272152 1707070 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1124 09:33:56.272210 1707070 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1124 09:33:56.279938 1707070 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1124 09:33:56.287667 1707070 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1124 09:33:56.287720 1707070 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1124 09:33:56.295096 1707070 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1124 09:33:56.302699 1707070 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1124 09:33:56.302758 1707070 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1124 09:33:56.310421 1707070 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1124 09:33:56.318128 1707070 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1124 09:33:56.318183 1707070 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1124 09:33:56.325438 1707070 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1124 09:33:56.364513 1707070 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-beta.0
	I1124 09:33:56.364563 1707070 kubeadm.go:319] [preflight] Running pre-flight checks
	I1124 09:33:56.440273 1707070 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1124 09:33:56.440340 1707070 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1124 09:33:56.440376 1707070 kubeadm.go:319] OS: Linux
	I1124 09:33:56.440420 1707070 kubeadm.go:319] CGROUPS_CPU: enabled
	I1124 09:33:56.440467 1707070 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1124 09:33:56.440513 1707070 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1124 09:33:56.440560 1707070 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1124 09:33:56.440606 1707070 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1124 09:33:56.440654 1707070 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1124 09:33:56.440697 1707070 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1124 09:33:56.440749 1707070 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1124 09:33:56.440794 1707070 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1124 09:33:56.504487 1707070 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1124 09:33:56.504590 1707070 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1124 09:33:56.504685 1707070 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1124 09:33:56.510220 1707070 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1124 09:33:56.513847 1707070 out.go:252]   - Generating certificates and keys ...
	I1124 09:33:56.513936 1707070 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1124 09:33:56.514003 1707070 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1124 09:33:56.514078 1707070 kubeadm.go:319] [certs] Using existing apiserver-kubelet-client certificate and key on disk
	I1124 09:33:56.514137 1707070 kubeadm.go:319] [certs] Using existing front-proxy-ca certificate authority
	I1124 09:33:56.514205 1707070 kubeadm.go:319] [certs] Using existing front-proxy-client certificate and key on disk
	I1124 09:33:56.514264 1707070 kubeadm.go:319] [certs] Using existing etcd/ca certificate authority
	I1124 09:33:56.514326 1707070 kubeadm.go:319] [certs] Using existing etcd/server certificate and key on disk
	I1124 09:33:56.514386 1707070 kubeadm.go:319] [certs] Using existing etcd/peer certificate and key on disk
	I1124 09:33:56.514481 1707070 kubeadm.go:319] [certs] Using existing etcd/healthcheck-client certificate and key on disk
	I1124 09:33:56.514553 1707070 kubeadm.go:319] [certs] Using existing apiserver-etcd-client certificate and key on disk
	I1124 09:33:56.514589 1707070 kubeadm.go:319] [certs] Using the existing "sa" key
	I1124 09:33:56.514644 1707070 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1124 09:33:57.046366 1707070 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1124 09:33:57.432965 1707070 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1124 09:33:57.802873 1707070 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1124 09:33:58.414576 1707070 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1124 09:33:58.520825 1707070 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1124 09:33:58.522049 1707070 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1124 09:33:58.526436 1707070 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1124 09:33:58.529676 1707070 out.go:252]   - Booting up control plane ...
	I1124 09:33:58.529779 1707070 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1124 09:33:58.529855 1707070 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1124 09:33:58.529921 1707070 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1124 09:33:58.549683 1707070 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1124 09:33:58.549801 1707070 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1124 09:33:58.557327 1707070 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1124 09:33:58.557589 1707070 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1124 09:33:58.557812 1707070 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1124 09:33:58.696439 1707070 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1124 09:33:58.696553 1707070 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1124 09:37:58.697446 1707070 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.001230859s
	I1124 09:37:58.697472 1707070 kubeadm.go:319] 
	I1124 09:37:58.697558 1707070 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1124 09:37:58.697602 1707070 kubeadm.go:319] 	- The kubelet is not running
	I1124 09:37:58.697730 1707070 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1124 09:37:58.697737 1707070 kubeadm.go:319] 
	I1124 09:37:58.697847 1707070 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1124 09:37:58.697878 1707070 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1124 09:37:58.697921 1707070 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1124 09:37:58.697925 1707070 kubeadm.go:319] 
	I1124 09:37:58.701577 1707070 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1124 09:37:58.701990 1707070 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1124 09:37:58.702104 1707070 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1124 09:37:58.702344 1707070 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	I1124 09:37:58.702350 1707070 kubeadm.go:319] 
	I1124 09:37:58.702417 1707070 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	I1124 09:37:58.702481 1707070 kubeadm.go:403] duration metric: took 12m9.652556415s to StartCluster
	I1124 09:37:58.702514 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:37:58.702578 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:37:58.726968 1707070 cri.go:89] found id: ""
	I1124 09:37:58.726981 1707070 logs.go:282] 0 containers: []
	W1124 09:37:58.726988 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:37:58.726994 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:37:58.727055 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:37:58.756184 1707070 cri.go:89] found id: ""
	I1124 09:37:58.756198 1707070 logs.go:282] 0 containers: []
	W1124 09:37:58.756205 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:37:58.756210 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:37:58.756266 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:37:58.781056 1707070 cri.go:89] found id: ""
	I1124 09:37:58.781070 1707070 logs.go:282] 0 containers: []
	W1124 09:37:58.781077 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:37:58.781082 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:37:58.781145 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:37:58.805769 1707070 cri.go:89] found id: ""
	I1124 09:37:58.805783 1707070 logs.go:282] 0 containers: []
	W1124 09:37:58.805790 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:37:58.805796 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:37:58.805854 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:37:58.830758 1707070 cri.go:89] found id: ""
	I1124 09:37:58.830780 1707070 logs.go:282] 0 containers: []
	W1124 09:37:58.830791 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:37:58.830797 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:37:58.830857 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:37:58.855967 1707070 cri.go:89] found id: ""
	I1124 09:37:58.855981 1707070 logs.go:282] 0 containers: []
	W1124 09:37:58.855988 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:37:58.855994 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:37:58.856051 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:37:58.890842 1707070 cri.go:89] found id: ""
	I1124 09:37:58.890857 1707070 logs.go:282] 0 containers: []
	W1124 09:37:58.890865 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:37:58.890873 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:37:58.890885 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:37:58.910142 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:37:58.910157 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:37:58.985463 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:37:58.976283   21584 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:37:58.977104   21584 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:37:58.978904   21584 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:37:58.979496   21584 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:37:58.981268   21584 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:37:58.976283   21584 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:37:58.977104   21584 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:37:58.978904   21584 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:37:58.979496   21584 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:37:58.981268   21584 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:37:58.985474 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:37:58.985486 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:37:59.051823 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:37:59.051845 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:37:59.080123 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:37:59.080139 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	W1124 09:37:59.137954 1707070 out.go:434] Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001230859s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	W1124 09:37:59.138000 1707070 out.go:285] * 
	W1124 09:37:59.138117 1707070 out.go:285] X Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001230859s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	W1124 09:37:59.138177 1707070 out.go:285] * 
	W1124 09:37:59.140306 1707070 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1124 09:37:59.145839 1707070 out.go:203] 
	W1124 09:37:59.149636 1707070 out.go:285] X Exiting due to K8S_KUBELET_NOT_RUNNING: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001230859s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	W1124 09:37:59.149678 1707070 out.go:285] * Suggestion: Check output of 'journalctl -xeu kubelet', try passing --extra-config=kubelet.cgroup-driver=systemd to minikube start
	W1124 09:37:59.149707 1707070 out.go:285] * Related issue: https://github.com/kubernetes/minikube/issues/4172
	I1124 09:37:59.153358 1707070 out.go:203] 
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> containerd <==
	Nov 24 09:25:47 functional-291288 containerd[10324]: time="2025-11-24T09:25:47.268843370Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
	Nov 24 09:25:47 functional-291288 containerd[10324]: time="2025-11-24T09:25:47.268907855Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
	Nov 24 09:25:47 functional-291288 containerd[10324]: time="2025-11-24T09:25:47.268972348Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
	Nov 24 09:25:47 functional-291288 containerd[10324]: time="2025-11-24T09:25:47.269028340Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
	Nov 24 09:25:47 functional-291288 containerd[10324]: time="2025-11-24T09:25:47.269093646Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
	Nov 24 09:25:47 functional-291288 containerd[10324]: time="2025-11-24T09:25:47.269161708Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1
	Nov 24 09:25:47 functional-291288 containerd[10324]: time="2025-11-24T09:25:47.269228367Z" level=info msg="runtime interface created"
	Nov 24 09:25:47 functional-291288 containerd[10324]: time="2025-11-24T09:25:47.269282981Z" level=info msg="created NRI interface"
	Nov 24 09:25:47 functional-291288 containerd[10324]: time="2025-11-24T09:25:47.269381066Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
	Nov 24 09:25:47 functional-291288 containerd[10324]: time="2025-11-24T09:25:47.269475385Z" level=info msg="Connect containerd service"
	Nov 24 09:25:47 functional-291288 containerd[10324]: time="2025-11-24T09:25:47.269860021Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
	Nov 24 09:25:47 functional-291288 containerd[10324]: time="2025-11-24T09:25:47.270611232Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
	Nov 24 09:25:47 functional-291288 containerd[10324]: time="2025-11-24T09:25:47.281475104Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
	Nov 24 09:25:47 functional-291288 containerd[10324]: time="2025-11-24T09:25:47.281548105Z" level=info msg=serving... address=/run/containerd/containerd.sock
	Nov 24 09:25:47 functional-291288 containerd[10324]: time="2025-11-24T09:25:47.281591536Z" level=info msg="Start subscribing containerd event"
	Nov 24 09:25:47 functional-291288 containerd[10324]: time="2025-11-24T09:25:47.281638691Z" level=info msg="Start recovering state"
	Nov 24 09:25:47 functional-291288 containerd[10324]: time="2025-11-24T09:25:47.310177719Z" level=info msg="Start event monitor"
	Nov 24 09:25:47 functional-291288 containerd[10324]: time="2025-11-24T09:25:47.310369614Z" level=info msg="Start cni network conf syncer for default"
	Nov 24 09:25:47 functional-291288 containerd[10324]: time="2025-11-24T09:25:47.310437783Z" level=info msg="Start streaming server"
	Nov 24 09:25:47 functional-291288 containerd[10324]: time="2025-11-24T09:25:47.310546157Z" level=info msg="Registered namespace \"k8s.io\" with NRI"
	Nov 24 09:25:47 functional-291288 containerd[10324]: time="2025-11-24T09:25:47.310605341Z" level=info msg="runtime interface starting up..."
	Nov 24 09:25:47 functional-291288 containerd[10324]: time="2025-11-24T09:25:47.310661563Z" level=info msg="starting plugins..."
	Nov 24 09:25:47 functional-291288 containerd[10324]: time="2025-11-24T09:25:47.310723160Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
	Nov 24 09:25:47 functional-291288 systemd[1]: Started containerd.service - containerd container runtime.
	Nov 24 09:25:47 functional-291288 containerd[10324]: time="2025-11-24T09:25:47.312804699Z" level=info msg="containerd successfully booted in 0.067611s"
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:38:00.626526   21702 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:38:00.627042   21702 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:38:00.628808   21702 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:38:00.629146   21702 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:38:00.630701   21702 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[Nov24 08:20] overlayfs: idmapped layers are currently not supported
	[Nov24 08:21] overlayfs: idmapped layers are currently not supported
	[Nov24 08:22] overlayfs: idmapped layers are currently not supported
	[Nov24 08:23] overlayfs: idmapped layers are currently not supported
	[Nov24 08:24] overlayfs: idmapped layers are currently not supported
	[ +19.566509] overlayfs: idmapped layers are currently not supported
	[Nov24 08:25] overlayfs: idmapped layers are currently not supported
	[ +35.934095] overlayfs: idmapped layers are currently not supported
	[Nov24 08:27] overlayfs: idmapped layers are currently not supported
	[Nov24 08:28] overlayfs: idmapped layers are currently not supported
	[Nov24 08:29] overlayfs: idmapped layers are currently not supported
	[Nov24 08:30] overlayfs: idmapped layers are currently not supported
	[Nov24 08:31] overlayfs: idmapped layers are currently not supported
	[ +44.505180] overlayfs: idmapped layers are currently not supported
	[Nov24 08:32] overlayfs: idmapped layers are currently not supported
	[Nov24 08:33] overlayfs: idmapped layers are currently not supported
	[Nov24 08:34] overlayfs: idmapped layers are currently not supported
	[ +21.137877] overlayfs: idmapped layers are currently not supported
	[ +28.724175] overlayfs: idmapped layers are currently not supported
	[Nov24 08:36] overlayfs: idmapped layers are currently not supported
	[Nov24 08:37] overlayfs: idmapped layers are currently not supported
	[Nov24 08:38] overlayfs: idmapped layers are currently not supported
	[  +4.063834] overlayfs: idmapped layers are currently not supported
	[Nov24 08:40] overlayfs: idmapped layers are currently not supported
	[Nov24 08:42] kauditd_printk_skb: 8 callbacks suppressed
	
	
	==> kernel <==
	 09:38:00 up  8:20,  0 user,  load average: 0.04, 0.14, 0.31
	Linux functional-291288 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Nov 24 09:37:57 functional-291288 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Nov 24 09:37:58 functional-291288 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 319.
	Nov 24 09:37:58 functional-291288 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:37:58 functional-291288 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:37:58 functional-291288 kubelet[21512]: E1124 09:37:58.174689   21512 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Nov 24 09:37:58 functional-291288 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Nov 24 09:37:58 functional-291288 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Nov 24 09:37:58 functional-291288 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 320.
	Nov 24 09:37:58 functional-291288 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:37:58 functional-291288 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:37:58 functional-291288 kubelet[21574]: E1124 09:37:58.945303   21574 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Nov 24 09:37:58 functional-291288 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Nov 24 09:37:58 functional-291288 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Nov 24 09:37:59 functional-291288 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 321.
	Nov 24 09:37:59 functional-291288 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:37:59 functional-291288 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:37:59 functional-291288 kubelet[21618]: E1124 09:37:59.693460   21618 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Nov 24 09:37:59 functional-291288 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Nov 24 09:37:59 functional-291288 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Nov 24 09:38:00 functional-291288 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 322.
	Nov 24 09:38:00 functional-291288 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:38:00 functional-291288 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:38:00 functional-291288 kubelet[21659]: E1124 09:38:00.453131   21659 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Nov 24 09:38:00 functional-291288 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Nov 24 09:38:00 functional-291288 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	

                                                
                                                
-- /stdout --
helpers_test.go:262: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-291288 -n functional-291288
helpers_test.go:262: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-291288 -n functional-291288: exit status 2 (341.815682ms)

                                                
                                                
-- stdout --
	Stopped

                                                
                                                
-- /stdout --
helpers_test.go:262: status error: exit status 2 (may be ok)
helpers_test.go:264: "functional-291288" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/ExtraConfig (737.68s)

                                                
                                    
TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/ComponentHealth (2.24s)

                                                
                                                
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/ComponentHealth
functional_test.go:825: (dbg) Run:  kubectl --context functional-291288 get po -l tier=control-plane -n kube-system -o=json
functional_test.go:825: (dbg) Non-zero exit: kubectl --context functional-291288 get po -l tier=control-plane -n kube-system -o=json: exit status 1 (59.537704ms)

                                                
                                                
-- stdout --
	{
	    "apiVersion": "v1",
	    "items": [],
	    "kind": "List",
	    "metadata": {
	        "resourceVersion": ""
	    }
	}

                                                
                                                
-- /stdout --
** stderr ** 
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?

                                                
                                                
** /stderr **
functional_test.go:827: failed to get components. args "kubectl --context functional-291288 get po -l tier=control-plane -n kube-system -o=json": exit status 1
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:223: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/ComponentHealth]: network settings <======
helpers_test.go:230: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:238: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/ComponentHealth]: docker inspect <======
helpers_test.go:239: (dbg) Run:  docker inspect functional-291288
helpers_test.go:243: (dbg) docker inspect functional-291288:

                                                
                                                
-- stdout --
	[
	    {
	        "Id": "70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52",
	        "Created": "2025-11-24T09:10:51.896020191Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 1695240,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-11-24T09:10:51.968983407Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:572c983e466f1f784136812eef5cc59ac623db764bc7704d3676c4643993fd08",
	        "ResolvConfPath": "/var/lib/docker/containers/70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52/hostname",
	        "HostsPath": "/var/lib/docker/containers/70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52/hosts",
	        "LogPath": "/var/lib/docker/containers/70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52/70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52-json.log",
	        "Name": "/functional-291288",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "functional-291288:/var",
	                "/lib/modules:/lib/modules:ro"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-291288",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52",
	                "LowerDir": "/var/lib/docker/overlay2/3cc6f5f8c809d502c515552ad283ef0dee330beb830a15376b0447c77fbc81b7-init/diff:/var/lib/docker/overlay2/bf294786c7ade59cb761c7bdcc27045362c3779b00a569b685443f34ade17737/diff",
	                "MergedDir": "/var/lib/docker/overlay2/3cc6f5f8c809d502c515552ad283ef0dee330beb830a15376b0447c77fbc81b7/merged",
	                "UpperDir": "/var/lib/docker/overlay2/3cc6f5f8c809d502c515552ad283ef0dee330beb830a15376b0447c77fbc81b7/diff",
	                "WorkDir": "/var/lib/docker/overlay2/3cc6f5f8c809d502c515552ad283ef0dee330beb830a15376b0447c77fbc81b7/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "volume",
	                "Name": "functional-291288",
	                "Source": "/var/lib/docker/volumes/functional-291288/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            },
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-291288",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-291288",
	                "name.minikube.sigs.k8s.io": "functional-291288",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "09c1c2eef0dca6362dde63b4cbc372c0cfa3e4fd084b8745043d8b88925691bf",
	            "SandboxKey": "/var/run/docker/netns/09c1c2eef0dc",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34684"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34685"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34688"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34686"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34687"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-291288": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "7e:49:22:0b:f9:2c",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "1e8f91e8ad9f46b831bbb1b0589b0022d940ee9875e64a648dc80612f3ca93dc",
	                    "EndpointID": "5de5ca8ccb07584b21e6e4e30dba12e0233e8d28c3e48e705cddffe75263b337",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-291288",
	                        "70848be15fcc"
	                    ]
	                }
	            }
	        }
	    }
	]

                                                
                                                
-- /stdout --
helpers_test.go:247: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-291288 -n functional-291288
helpers_test.go:247: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-291288 -n functional-291288: exit status 2 (330.759584ms)

                                                
                                                
-- stdout --
	Running

                                                
                                                
-- /stdout --
helpers_test.go:247: status error: exit status 2 (may be ok)
helpers_test.go:252: <<< TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/ComponentHealth FAILED: start of post-mortem logs <<<
helpers_test.go:253: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/ComponentHealth]: minikube logs <======
helpers_test.go:255: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 logs -n 25
helpers_test.go:260: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/ComponentHealth logs: 
-- stdout --
	
	==> Audit <==
	┌─────────┬─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│ COMMAND │                                                                          ARGS                                                                           │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├─────────┼─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ image   │ functional-941011 image ls --format yaml --alsologtostderr                                                                                              │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:07 UTC │ 24 Nov 25 09:07 UTC │
	│ ssh     │ functional-941011 ssh pgrep buildkitd                                                                                                                   │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:07 UTC │                     │
	│ image   │ functional-941011 image build -t localhost/my-image:functional-941011 testdata/build --alsologtostderr                                                  │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:07 UTC │ 24 Nov 25 09:07 UTC │
	│ image   │ functional-941011 image ls                                                                                                                              │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:07 UTC │ 24 Nov 25 09:07 UTC │
	│ image   │ functional-941011 image ls --format json --alsologtostderr                                                                                              │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:07 UTC │ 24 Nov 25 09:07 UTC │
	│ image   │ functional-941011 image ls --format table --alsologtostderr                                                                                             │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:07 UTC │ 24 Nov 25 09:07 UTC │
	│ delete  │ -p functional-941011                                                                                                                                    │ functional-941011 │ jenkins │ v1.37.0 │ 24 Nov 25 09:10 UTC │ 24 Nov 25 09:10 UTC │
	│ start   │ -p functional-291288 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-beta.0 │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:10 UTC │                     │
	│ start   │ -p functional-291288 --alsologtostderr -v=8                                                                                                             │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:19 UTC │                     │
	│ cache   │ functional-291288 cache add registry.k8s.io/pause:3.1                                                                                                   │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:25 UTC │ 24 Nov 25 09:25 UTC │
	│ cache   │ functional-291288 cache add registry.k8s.io/pause:3.3                                                                                                   │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:25 UTC │ 24 Nov 25 09:25 UTC │
	│ cache   │ functional-291288 cache add registry.k8s.io/pause:latest                                                                                                │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:25 UTC │ 24 Nov 25 09:25 UTC │
	│ cache   │ functional-291288 cache add minikube-local-cache-test:functional-291288                                                                                 │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:25 UTC │ 24 Nov 25 09:25 UTC │
	│ cache   │ functional-291288 cache delete minikube-local-cache-test:functional-291288                                                                              │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:25 UTC │ 24 Nov 25 09:25 UTC │
	│ cache   │ delete registry.k8s.io/pause:3.3                                                                                                                        │ minikube          │ jenkins │ v1.37.0 │ 24 Nov 25 09:25 UTC │ 24 Nov 25 09:25 UTC │
	│ cache   │ list                                                                                                                                                    │ minikube          │ jenkins │ v1.37.0 │ 24 Nov 25 09:25 UTC │ 24 Nov 25 09:25 UTC │
	│ ssh     │ functional-291288 ssh sudo crictl images                                                                                                                │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:25 UTC │ 24 Nov 25 09:25 UTC │
	│ ssh     │ functional-291288 ssh sudo crictl rmi registry.k8s.io/pause:latest                                                                                      │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:25 UTC │ 24 Nov 25 09:25 UTC │
	│ ssh     │ functional-291288 ssh sudo crictl inspecti registry.k8s.io/pause:latest                                                                                 │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:25 UTC │                     │
	│ cache   │ functional-291288 cache reload                                                                                                                          │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:25 UTC │ 24 Nov 25 09:25 UTC │
	│ ssh     │ functional-291288 ssh sudo crictl inspecti registry.k8s.io/pause:latest                                                                                 │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:25 UTC │ 24 Nov 25 09:25 UTC │
	│ cache   │ delete registry.k8s.io/pause:3.1                                                                                                                        │ minikube          │ jenkins │ v1.37.0 │ 24 Nov 25 09:25 UTC │ 24 Nov 25 09:25 UTC │
	│ cache   │ delete registry.k8s.io/pause:latest                                                                                                                     │ minikube          │ jenkins │ v1.37.0 │ 24 Nov 25 09:25 UTC │ 24 Nov 25 09:25 UTC │
	│ kubectl │ functional-291288 kubectl -- --context functional-291288 get pods                                                                                       │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:25 UTC │                     │
	│ start   │ -p functional-291288 --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision --wait=all                                                │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:25 UTC │                     │
	└─────────┴─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/11/24 09:25:43
	Running on machine: ip-172-31-30-239
	Binary: Built with gc go1.25.3 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1124 09:25:43.956868 1707070 out.go:360] Setting OutFile to fd 1 ...
	I1124 09:25:43.957002 1707070 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1124 09:25:43.957006 1707070 out.go:374] Setting ErrFile to fd 2...
	I1124 09:25:43.957010 1707070 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1124 09:25:43.957247 1707070 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21978-1652607/.minikube/bin
	I1124 09:25:43.957575 1707070 out.go:368] Setting JSON to false
	I1124 09:25:43.958421 1707070 start.go:133] hostinfo: {"hostname":"ip-172-31-30-239","uptime":29273,"bootTime":1763947071,"procs":157,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"92f46a7d-c249-4c12-924a-77f64874c910"}
	I1124 09:25:43.958501 1707070 start.go:143] virtualization:  
	I1124 09:25:43.961954 1707070 out.go:179] * [functional-291288] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1124 09:25:43.965745 1707070 out.go:179]   - MINIKUBE_LOCATION=21978
	I1124 09:25:43.965806 1707070 notify.go:221] Checking for updates...
	I1124 09:25:43.971831 1707070 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1124 09:25:43.974596 1707070 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/21978-1652607/kubeconfig
	I1124 09:25:43.977531 1707070 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/21978-1652607/.minikube
	I1124 09:25:43.980447 1707070 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1124 09:25:43.983266 1707070 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1124 09:25:43.986897 1707070 config.go:182] Loaded profile config "functional-291288": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1124 09:25:43.986999 1707070 driver.go:422] Setting default libvirt URI to qemu:///system
	I1124 09:25:44.009686 1707070 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1124 09:25:44.009789 1707070 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1124 09:25:44.075505 1707070 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:55 SystemTime:2025-11-24 09:25:44.065719192 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214831104 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1124 09:25:44.075607 1707070 docker.go:319] overlay module found
	I1124 09:25:44.080493 1707070 out.go:179] * Using the docker driver based on existing profile
	I1124 09:25:44.083298 1707070 start.go:309] selected driver: docker
	I1124 09:25:44.083323 1707070 start.go:927] validating driver "docker" against &{Name:functional-291288 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-291288 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1124 09:25:44.083409 1707070 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1124 09:25:44.083513 1707070 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1124 09:25:44.137525 1707070 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:55 SystemTime:2025-11-24 09:25:44.127840235 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214831104 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1124 09:25:44.137959 1707070 start_flags.go:992] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I1124 09:25:44.137984 1707070 cni.go:84] Creating CNI manager for ""
	I1124 09:25:44.138040 1707070 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1124 09:25:44.138097 1707070 start.go:353] cluster config:
	{Name:functional-291288 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-291288 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1124 09:25:44.143064 1707070 out.go:179] * Starting "functional-291288" primary control-plane node in "functional-291288" cluster
	I1124 09:25:44.145761 1707070 cache.go:134] Beginning downloading kic base image for docker with containerd
	I1124 09:25:44.148578 1707070 out.go:179] * Pulling base image v0.0.48-1763789673-21948 ...
	I1124 09:25:44.151418 1707070 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
	I1124 09:25:44.151496 1707070 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f in local docker daemon
	I1124 09:25:44.171581 1707070 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f in local docker daemon, skipping pull
	I1124 09:25:44.171593 1707070 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f exists in daemon, skipping load
	W1124 09:25:44.210575 1707070 preload.go:144] https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.35.0-beta.0/preloaded-images-k8s-v18-v1.35.0-beta.0-containerd-overlay2-arm64.tar.lz4 status code: 404
	W1124 09:25:44.425167 1707070 preload.go:144] https://github.com/kubernetes-sigs/minikube-preloads/releases/download/v18/preloaded-images-k8s-v18-v1.35.0-beta.0-containerd-overlay2-arm64.tar.lz4 status code: 404
	I1124 09:25:44.425335 1707070 profile.go:143] Saving config to /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/config.json ...
	I1124 09:25:44.425459 1707070 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:25:44.425602 1707070 cache.go:243] Successfully downloaded all kic artifacts
	I1124 09:25:44.425631 1707070 start.go:360] acquireMachinesLock for functional-291288: {Name:mk85384dc057570e1f34db593d357cea738652c4 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:25:44.425681 1707070 start.go:364] duration metric: took 28.381µs to acquireMachinesLock for "functional-291288"
	I1124 09:25:44.425694 1707070 start.go:96] Skipping create...Using existing machine configuration
	I1124 09:25:44.425698 1707070 fix.go:54] fixHost starting: 
	I1124 09:25:44.425962 1707070 cli_runner.go:164] Run: docker container inspect functional-291288 --format={{.State.Status}}
	I1124 09:25:44.443478 1707070 fix.go:112] recreateIfNeeded on functional-291288: state=Running err=<nil>
	W1124 09:25:44.443512 1707070 fix.go:138] unexpected machine state, will restart: <nil>
	I1124 09:25:44.447296 1707070 out.go:252] * Updating the running docker "functional-291288" container ...
	I1124 09:25:44.447326 1707070 machine.go:94] provisionDockerMachine start ...
	I1124 09:25:44.447405 1707070 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:25:44.465953 1707070 main.go:143] libmachine: Using SSH client type: native
	I1124 09:25:44.466284 1707070 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 34684 <nil> <nil>}
	I1124 09:25:44.466291 1707070 main.go:143] libmachine: About to run SSH command:
	hostname
	I1124 09:25:44.603673 1707070 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:25:44.618572 1707070 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-291288
	
	I1124 09:25:44.618586 1707070 ubuntu.go:182] provisioning hostname "functional-291288"
	I1124 09:25:44.618668 1707070 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:25:44.659382 1707070 main.go:143] libmachine: Using SSH client type: native
	I1124 09:25:44.659732 1707070 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 34684 <nil> <nil>}
	I1124 09:25:44.659741 1707070 main.go:143] libmachine: About to run SSH command:
	sudo hostname functional-291288 && echo "functional-291288" | sudo tee /etc/hostname
	I1124 09:25:44.806505 1707070 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:25:44.844189 1707070 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-291288
	
	I1124 09:25:44.844281 1707070 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:25:44.868659 1707070 main.go:143] libmachine: Using SSH client type: native
	I1124 09:25:44.869019 1707070 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 34684 <nil> <nil>}
	I1124 09:25:44.869041 1707070 main.go:143] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sfunctional-291288' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 functional-291288/g' /etc/hosts;
				else 
					echo '127.0.1.1 functional-291288' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1124 09:25:44.979106 1707070 cache.go:107] acquiring lock: {Name:mk22a10f0ce1f3295b61e7e76c455d0494a3e278 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:25:44.979193 1707070 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 exists
	I1124 09:25:44.979201 1707070 cache.go:96] cache image "gcr.io/k8s-minikube/storage-provisioner:v5" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5" took 127.862µs
	I1124 09:25:44.979207 1707070 cache.go:80] save to tar file gcr.io/k8s-minikube/storage-provisioner:v5 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 succeeded
	I1124 09:25:44.979198 1707070 cache.go:107] acquiring lock: {Name:mk80fdbe7cdb5bc17c2a82b4ecfd00214559a435 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:25:44.979218 1707070 cache.go:107] acquiring lock: {Name:mk85f1502dbb97830776608fb729eb3605e112e6 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:25:44.979237 1707070 cache.go:107] acquiring lock: {Name:mk46ce3b59d7e062b3dbc8a90fe5b4231f256471 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:25:44.979267 1707070 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 exists
	I1124 09:25:44.979266 1707070 cache.go:107] acquiring lock: {Name:mk1cf42e67442503a46c578224bd3cb68bf682d4 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:25:44.979273 1707070 cache.go:96] cache image "registry.k8s.io/pause:3.10.1" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1" took 55.992µs
	I1124 09:25:44.979277 1707070 cache.go:80] save to tar file registry.k8s.io/pause:3.10.1 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 succeeded
	I1124 09:25:44.979285 1707070 cache.go:107] acquiring lock: {Name:mk726502cb84c177b2e14fee88512325761511c5 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:25:44.979301 1707070 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 exists
	I1124 09:25:44.979310 1707070 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 exists
	I1124 09:25:44.979308 1707070 cache.go:96] cache image "registry.k8s.io/kube-apiserver:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0" took 43.274µs
	I1124 09:25:44.979314 1707070 cache.go:96] cache image "registry.k8s.io/coredns/coredns:v1.13.1" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1" took 29.982µs
	I1124 09:25:44.979319 1707070 cache.go:80] save to tar file registry.k8s.io/coredns/coredns:v1.13.1 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 succeeded
	I1124 09:25:44.979319 1707070 cache.go:80] save to tar file registry.k8s.io/kube-apiserver:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 succeeded
	I1124 09:25:44.979326 1707070 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.5.24-0 exists
	I1124 09:25:44.979330 1707070 cache.go:96] cache image "registry.k8s.io/etcd:3.5.24-0" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.5.24-0" took 94.392µs
	I1124 09:25:44.979336 1707070 cache.go:80] save to tar file registry.k8s.io/etcd:3.5.24-0 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.5.24-0 succeeded
	I1124 09:25:44.979330 1707070 cache.go:107] acquiring lock: {Name:mkfdc49c8e68aee34cee0c9d441ae8a4dca675c6 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:25:44.979345 1707070 cache.go:107] acquiring lock: {Name:mkdbf38e05e2c47c1a7a906a2236e9e7020a94c5 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:25:44.979364 1707070 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 exists
	I1124 09:25:44.979370 1707070 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 exists
	I1124 09:25:44.979368 1707070 cache.go:96] cache image "registry.k8s.io/kube-controller-manager:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0" took 40.427µs
	I1124 09:25:44.979373 1707070 cache.go:96] cache image "registry.k8s.io/kube-scheduler:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0" took 29.49µs
	I1124 09:25:44.979375 1707070 cache.go:80] save to tar file registry.k8s.io/kube-controller-manager:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 succeeded
	I1124 09:25:44.979378 1707070 cache.go:80] save to tar file registry.k8s.io/kube-scheduler:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 succeeded
	I1124 09:25:44.979407 1707070 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 exists
	I1124 09:25:44.979413 1707070 cache.go:96] cache image "registry.k8s.io/kube-proxy:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0" took 225.709µs
	I1124 09:25:44.979418 1707070 cache.go:80] save to tar file registry.k8s.io/kube-proxy:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 succeeded
	I1124 09:25:44.979424 1707070 cache.go:87] Successfully saved all images to host disk.
	I1124 09:25:45.028668 1707070 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	I1124 09:25:45.028686 1707070 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/21978-1652607/.minikube CaCertPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/21978-1652607/.minikube}
	I1124 09:25:45.028706 1707070 ubuntu.go:190] setting up certificates
	I1124 09:25:45.028727 1707070 provision.go:84] configureAuth start
	I1124 09:25:45.028800 1707070 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-291288
	I1124 09:25:45.083635 1707070 provision.go:143] copyHostCerts
	I1124 09:25:45.083709 1707070 exec_runner.go:144] found /home/jenkins/minikube-integration/21978-1652607/.minikube/key.pem, removing ...
	I1124 09:25:45.083718 1707070 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21978-1652607/.minikube/key.pem
	I1124 09:25:45.083806 1707070 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/21978-1652607/.minikube/key.pem (1679 bytes)
	I1124 09:25:45.083920 1707070 exec_runner.go:144] found /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.pem, removing ...
	I1124 09:25:45.083924 1707070 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.pem
	I1124 09:25:45.083951 1707070 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.pem (1078 bytes)
	I1124 09:25:45.084006 1707070 exec_runner.go:144] found /home/jenkins/minikube-integration/21978-1652607/.minikube/cert.pem, removing ...
	I1124 09:25:45.084009 1707070 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21978-1652607/.minikube/cert.pem
	I1124 09:25:45.084038 1707070 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/21978-1652607/.minikube/cert.pem (1123 bytes)
	I1124 09:25:45.084083 1707070 provision.go:117] generating server cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca-key.pem org=jenkins.functional-291288 san=[127.0.0.1 192.168.49.2 functional-291288 localhost minikube]
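configureAuth generates a server certificate whose SANs cover every name the node may be reached by (loopback, container IP, profile name, `localhost`, `minikube`). A rough equivalent using the `openssl` CLI in place of minikube's Go cert generation, writing to temp files (requires OpenSSL 1.1.1+ for `-addext`):

```shell
#!/bin/sh
# Self-signed server cert with the same SAN set as the log line above.
DIR=$(mktemp -d)
openssl req -x509 -newkey rsa:2048 -nodes -days 1 \
  -keyout "$DIR/server-key.pem" -out "$DIR/server.pem" \
  -subj "/O=jenkins.demo" \
  -addext "subjectAltName=IP:127.0.0.1,IP:192.168.49.2,DNS:localhost,DNS:minikube" \
  2>/dev/null
# print the SAN extension to confirm what was embedded
openssl x509 -in "$DIR/server.pem" -noout -ext subjectAltName
```

Clients validating by IP (127.0.0.1 through the forwarded SSH/docker port) or by hostname both match against this SAN list, which is why the list mixes IPs and DNS names.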
	I1124 09:25:45.498574 1707070 provision.go:177] copyRemoteCerts
	I1124 09:25:45.498637 1707070 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1124 09:25:45.498677 1707070 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:25:45.520187 1707070 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34684 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-291288/id_rsa Username:docker}
	I1124 09:25:45.626724 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I1124 09:25:45.644660 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server.pem --> /etc/docker/server.pem (1220 bytes)
	I1124 09:25:45.663269 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I1124 09:25:45.681392 1707070 provision.go:87] duration metric: took 652.643227ms to configureAuth
	I1124 09:25:45.681410 1707070 ubuntu.go:206] setting minikube options for container-runtime
	I1124 09:25:45.681611 1707070 config.go:182] Loaded profile config "functional-291288": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1124 09:25:45.681617 1707070 machine.go:97] duration metric: took 1.234286229s to provisionDockerMachine
	I1124 09:25:45.681624 1707070 start.go:293] postStartSetup for "functional-291288" (driver="docker")
	I1124 09:25:45.681634 1707070 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1124 09:25:45.681687 1707070 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1124 09:25:45.681727 1707070 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:25:45.698790 1707070 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34684 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-291288/id_rsa Username:docker}
	I1124 09:25:45.802503 1707070 ssh_runner.go:195] Run: cat /etc/os-release
	I1124 09:25:45.805922 1707070 main.go:143] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1124 09:25:45.805944 1707070 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1124 09:25:45.805954 1707070 filesync.go:126] Scanning /home/jenkins/minikube-integration/21978-1652607/.minikube/addons for local assets ...
	I1124 09:25:45.806011 1707070 filesync.go:126] Scanning /home/jenkins/minikube-integration/21978-1652607/.minikube/files for local assets ...
	I1124 09:25:45.806087 1707070 filesync.go:149] local asset: /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/ssl/certs/16544672.pem -> 16544672.pem in /etc/ssl/certs
	I1124 09:25:45.806167 1707070 filesync.go:149] local asset: /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/test/nested/copy/1654467/hosts -> hosts in /etc/test/nested/copy/1654467
	I1124 09:25:45.806257 1707070 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs /etc/test/nested/copy/1654467
	I1124 09:25:45.814093 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/ssl/certs/16544672.pem --> /etc/ssl/certs/16544672.pem (1708 bytes)
	I1124 09:25:45.832308 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/test/nested/copy/1654467/hosts --> /etc/test/nested/copy/1654467/hosts (40 bytes)
	I1124 09:25:45.850625 1707070 start.go:296] duration metric: took 168.9873ms for postStartSetup
	I1124 09:25:45.850696 1707070 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1124 09:25:45.850734 1707070 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:25:45.868479 1707070 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34684 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-291288/id_rsa Username:docker}
	I1124 09:25:45.971382 1707070 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1124 09:25:45.976655 1707070 fix.go:56] duration metric: took 1.550948262s for fixHost
	I1124 09:25:45.976671 1707070 start.go:83] releasing machines lock for "functional-291288", held for 1.550982815s
	I1124 09:25:45.976739 1707070 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-291288
	I1124 09:25:45.997505 1707070 ssh_runner.go:195] Run: cat /version.json
	I1124 09:25:45.997527 1707070 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1124 09:25:45.997550 1707070 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:25:45.997588 1707070 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:25:46.017321 1707070 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34684 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-291288/id_rsa Username:docker}
	I1124 09:25:46.018732 1707070 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34684 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-291288/id_rsa Username:docker}
	I1124 09:25:46.118131 1707070 ssh_runner.go:195] Run: systemctl --version
	I1124 09:25:46.213854 1707070 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W1124 09:25:46.218087 1707070 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1124 09:25:46.218149 1707070 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1124 09:25:46.225944 1707070 cni.go:259] no active bridge cni configs found in "/etc/cni/net.d" - nothing to disable
	I1124 09:25:46.225958 1707070 start.go:496] detecting cgroup driver to use...
	I1124 09:25:46.225989 1707070 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1124 09:25:46.226035 1707070 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I1124 09:25:46.241323 1707070 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I1124 09:25:46.254720 1707070 docker.go:218] disabling cri-docker service (if available) ...
	I1124 09:25:46.254789 1707070 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1124 09:25:46.270340 1707070 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1124 09:25:46.283549 1707070 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1124 09:25:46.399926 1707070 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1124 09:25:46.515234 1707070 docker.go:234] disabling docker service ...
	I1124 09:25:46.515290 1707070 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1124 09:25:46.529899 1707070 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1124 09:25:46.543047 1707070 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1124 09:25:46.658532 1707070 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1124 09:25:46.775880 1707070 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1124 09:25:46.790551 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I1124 09:25:46.806411 1707070 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:25:46.967053 1707070 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10.1"|' /etc/containerd/config.toml"
	I1124 09:25:46.977583 1707070 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I1124 09:25:46.986552 1707070 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I1124 09:25:46.986618 1707070 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I1124 09:25:46.995635 1707070 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1124 09:25:47.005680 1707070 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I1124 09:25:47.015425 1707070 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1124 09:25:47.024808 1707070 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1124 09:25:47.033022 1707070 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I1124 09:25:47.041980 1707070 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I1124 09:25:47.051362 1707070 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
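The run of `sed -i -r` commands above edits `/etc/containerd/config.toml` in place: pin the pause (sandbox) image, force `SystemdCgroup = false` to match the detected "cgroupfs" driver, and normalize the runc runtime version. The two key rewrites, reproduced against a minimal stand-in config in a temp file:

```shell
#!/bin/sh
# Minimal stand-in for /etc/containerd/config.toml.
CFG=$(mktemp)
cat > "$CFG" <<'EOF'
[plugins."io.containerd.grpc.v1.cri"]
  sandbox_image = "registry.k8s.io/pause:3.9"
  [plugins."io.containerd.grpc.v1.cri".containerd.runtimes.runc.options]
    SystemdCgroup = true
EOF

# pin the sandbox image expected by this Kubernetes version
sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10.1"|' "$CFG"
# disable systemd cgroups so containerd uses the cgroupfs driver
sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' "$CFG"
cat "$CFG"
```

The `\1` backreference preserves the original indentation, so the TOML nesting is untouched; the subsequent `systemctl restart containerd` in the log is what makes the edits take effect.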
	I1124 09:25:47.060469 1707070 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1124 09:25:47.068004 1707070 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1124 09:25:47.075326 1707070 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1124 09:25:47.191217 1707070 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I1124 09:25:47.313892 1707070 start.go:543] Will wait 60s for socket path /run/containerd/containerd.sock
	I1124 09:25:47.313955 1707070 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I1124 09:25:47.318001 1707070 start.go:564] Will wait 60s for crictl version
	I1124 09:25:47.318060 1707070 ssh_runner.go:195] Run: which crictl
	I1124 09:25:47.321766 1707070 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1124 09:25:47.347974 1707070 start.go:580] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v2.1.5
	RuntimeApiVersion:  v1
	I1124 09:25:47.348042 1707070 ssh_runner.go:195] Run: containerd --version
	I1124 09:25:47.369074 1707070 ssh_runner.go:195] Run: containerd --version
	I1124 09:25:47.394675 1707070 out.go:179] * Preparing Kubernetes v1.35.0-beta.0 on containerd 2.1.5 ...
	I1124 09:25:47.397593 1707070 cli_runner.go:164] Run: docker network inspect functional-291288 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1124 09:25:47.412872 1707070 ssh_runner.go:195] Run: grep 192.168.49.1	host.minikube.internal$ /etc/hosts
	I1124 09:25:47.419437 1707070 out.go:179]   - apiserver.enable-admission-plugins=NamespaceAutoProvision
	I1124 09:25:47.422135 1707070 kubeadm.go:884] updating cluster {Name:functional-291288 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-291288 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1124 09:25:47.422352 1707070 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:25:47.578507 1707070 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:25:47.745390 1707070 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:25:47.894887 1707070 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
	I1124 09:25:47.894982 1707070 ssh_runner.go:195] Run: sudo crictl images --output json
	I1124 09:25:47.919585 1707070 containerd.go:627] all images are preloaded for containerd runtime.
	I1124 09:25:47.919604 1707070 cache_images.go:86] Images are preloaded, skipping loading
	I1124 09:25:47.919612 1707070 kubeadm.go:935] updating node { 192.168.49.2 8441 v1.35.0-beta.0 containerd true true} ...
	I1124 09:25:47.919707 1707070 kubeadm.go:947] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.35.0-beta.0/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=functional-291288 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.49.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-291288 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I1124 09:25:47.919778 1707070 ssh_runner.go:195] Run: sudo crictl info
	I1124 09:25:47.948265 1707070 extraconfig.go:125] Overwriting default enable-admission-plugins=NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota with user provided enable-admission-plugins=NamespaceAutoProvision for component apiserver
	I1124 09:25:47.948285 1707070 cni.go:84] Creating CNI manager for ""
	I1124 09:25:47.948293 1707070 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1124 09:25:47.948308 1707070 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1124 09:25:47.948331 1707070 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.49.2 APIServerPort:8441 KubernetesVersion:v1.35.0-beta.0 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:functional-291288 NodeName:functional-291288 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceAutoProvision] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.49.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.49.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1124 09:25:47.948441 1707070 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.49.2
	  bindPort: 8441
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "functional-291288"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.49.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceAutoProvision"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8441
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.35.0-beta.0
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I1124 09:25:47.948507 1707070 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.35.0-beta.0
	I1124 09:25:47.956183 1707070 binaries.go:51] Found k8s binaries, skipping transfer
	I1124 09:25:47.956246 1707070 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1124 09:25:47.963641 1707070 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (328 bytes)
	I1124 09:25:47.976586 1707070 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (359 bytes)
	I1124 09:25:47.989056 1707070 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2087 bytes)
	I1124 09:25:48.003961 1707070 ssh_runner.go:195] Run: grep 192.168.49.2	control-plane.minikube.internal$ /etc/hosts
	I1124 09:25:48.011533 1707070 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1124 09:25:48.134407 1707070 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1124 09:25:48.383061 1707070 certs.go:69] Setting up /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288 for IP: 192.168.49.2
	I1124 09:25:48.383072 1707070 certs.go:195] generating shared ca certs ...
	I1124 09:25:48.383086 1707070 certs.go:227] acquiring lock for ca certs: {Name:mkbe540a30c4376a351176f7fe6fec044d058b09 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1124 09:25:48.383238 1707070 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.key
	I1124 09:25:48.383279 1707070 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/proxy-client-ca.key
	I1124 09:25:48.383286 1707070 certs.go:257] generating profile certs ...
	I1124 09:25:48.383366 1707070 certs.go:360] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/client.key
	I1124 09:25:48.383420 1707070 certs.go:360] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/apiserver.key.5acb2515
	I1124 09:25:48.383456 1707070 certs.go:360] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/proxy-client.key
	I1124 09:25:48.383562 1707070 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/1654467.pem (1338 bytes)
	W1124 09:25:48.383598 1707070 certs.go:480] ignoring /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/1654467_empty.pem, impossibly tiny 0 bytes
	I1124 09:25:48.383605 1707070 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca-key.pem (1671 bytes)
	I1124 09:25:48.383632 1707070 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem (1078 bytes)
	I1124 09:25:48.383655 1707070 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/cert.pem (1123 bytes)
	I1124 09:25:48.383684 1707070 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/key.pem (1679 bytes)
	I1124 09:25:48.383730 1707070 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/ssl/certs/16544672.pem (1708 bytes)
	I1124 09:25:48.384294 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1124 09:25:48.403533 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I1124 09:25:48.421212 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1124 09:25:48.441887 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I1124 09:25:48.462311 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I1124 09:25:48.480889 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I1124 09:25:48.499086 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1124 09:25:48.517112 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I1124 09:25:48.535554 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/ssl/certs/16544672.pem --> /usr/share/ca-certificates/16544672.pem (1708 bytes)
	I1124 09:25:48.553310 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1124 09:25:48.571447 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/1654467.pem --> /usr/share/ca-certificates/1654467.pem (1338 bytes)
	I1124 09:25:48.589094 1707070 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I1124 09:25:48.602393 1707070 ssh_runner.go:195] Run: openssl version
	I1124 09:25:48.608953 1707070 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/16544672.pem && ln -fs /usr/share/ca-certificates/16544672.pem /etc/ssl/certs/16544672.pem"
	I1124 09:25:48.617886 1707070 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/16544672.pem
	I1124 09:25:48.621697 1707070 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Nov 24 09:10 /usr/share/ca-certificates/16544672.pem
	I1124 09:25:48.621756 1707070 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/16544672.pem
	I1124 09:25:48.663214 1707070 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/16544672.pem /etc/ssl/certs/3ec20f2e.0"
	I1124 09:25:48.671328 1707070 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I1124 09:25:48.679977 1707070 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1124 09:25:48.683961 1707070 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Nov 24 08:44 /usr/share/ca-certificates/minikubeCA.pem
	I1124 09:25:48.684024 1707070 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1124 09:25:48.725273 1707070 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I1124 09:25:48.733278 1707070 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1654467.pem && ln -fs /usr/share/ca-certificates/1654467.pem /etc/ssl/certs/1654467.pem"
	I1124 09:25:48.741887 1707070 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1654467.pem
	I1124 09:25:48.745440 1707070 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Nov 24 09:10 /usr/share/ca-certificates/1654467.pem
	I1124 09:25:48.745500 1707070 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1654467.pem
	I1124 09:25:48.791338 1707070 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1654467.pem /etc/ssl/certs/51391683.0"
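	The `test -s … && ln -fs`, `openssl x509 -hash`, and `ln … <hash>.0` sequence above is minikube installing each CA into OpenSSL's hashed certificate directory: a non-empty PEM is linked into `/etc/ssl/certs`, its subject-name hash is computed, and a `<hash>.0` symlink is created so OpenSSL's lookup-by-directory can find it. A minimal sketch of the same mechanism, run against a throwaway self-signed cert in a temp dir (a stand-in for `/etc/ssl/certs`; paths and names here are illustrative, not from this run):

```shell
# Sketch of the hash-and-symlink step from the log, using a throwaway
# self-signed cert (requires openssl; temp dir stands in for /etc/ssl/certs).
set -eu
dir=$(mktemp -d)
openssl req -x509 -newkey rsa:2048 -nodes -days 1 -subj "/CN=minikubeCA-demo" \
  -keyout "$dir/ca.key" -out "$dir/ca.pem" 2>/dev/null
test -s "$dir/ca.pem"                        # the log skips 0-byte certs the same way
hash=$(openssl x509 -hash -noout -in "$dir/ca.pem")
ln -fs "$dir/ca.pem" "$dir/$hash.0"          # OpenSSL resolves CAs as <subject-hash>.0
echo "$hash.0"
```

	The `test -L … || ln -fs` guard seen in the log only creates the hash link when one is not already present.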
	I1124 09:25:48.799503 1707070 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1124 09:25:48.803145 1707070 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I1124 09:25:48.844016 1707070 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I1124 09:25:48.884962 1707070 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I1124 09:25:48.926044 1707070 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I1124 09:25:48.967289 1707070 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I1124 09:25:49.008697 1707070 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
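	Each `-checkend 86400` run above asks openssl whether the given control-plane cert will expire within the next 86400 seconds (24 hours); exit status 0 means it remains valid past that window, non-zero triggers regeneration. A sketch against a freshly generated cert (file names are illustrative, not the paths from this run):

```shell
# Sketch: the per-cert expiry check the log performs, on a throwaway cert.
set -eu
dir=$(mktemp -d)
openssl req -x509 -newkey rsa:2048 -nodes -days 30 -subj "/CN=apiserver-demo" \
  -keyout "$dir/tls.key" -out "$dir/tls.crt" 2>/dev/null
# Exit 0 means the cert will NOT expire within the next 86400 s (24 h).
if openssl x509 -noout -in "$dir/tls.crt" -checkend 86400; then
  status=valid
else
  status=expiring
fi
echo "$status"
```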
	I1124 09:25:49.049934 1707070 kubeadm.go:401] StartCluster: {Name:functional-291288 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-291288 Namespace:default APIServerHAVIP: APIServerName:minikubeCA
APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker
BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1124 09:25:49.050012 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I1124 09:25:49.050074 1707070 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1124 09:25:49.080420 1707070 cri.go:89] found id: ""
	I1124 09:25:49.080484 1707070 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1124 09:25:49.088364 1707070 kubeadm.go:417] found existing configuration files, will attempt cluster restart
	I1124 09:25:49.088374 1707070 kubeadm.go:598] restartPrimaryControlPlane start ...
	I1124 09:25:49.088425 1707070 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I1124 09:25:49.095680 1707070 kubeadm.go:131] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I1124 09:25:49.096194 1707070 kubeconfig.go:125] found "functional-291288" server: "https://192.168.49.2:8441"
	I1124 09:25:49.097500 1707070 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I1124 09:25:49.105267 1707070 kubeadm.go:645] detected kubeadm config drift (will reconfigure cluster from new /var/tmp/minikube/kubeadm.yaml):
	-- stdout --
	--- /var/tmp/minikube/kubeadm.yaml	2025-11-24 09:11:10.138797725 +0000
	+++ /var/tmp/minikube/kubeadm.yaml.new	2025-11-24 09:25:47.995648074 +0000
	@@ -24,7 +24,7 @@
	   certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	   extraArgs:
	     - name: "enable-admission-plugins"
	-      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	+      value: "NamespaceAutoProvision"
	 controllerManager:
	   extraArgs:
	     - name: "allocate-node-cidrs"
	
	-- /stdout --
	I1124 09:25:49.105285 1707070 kubeadm.go:1161] stopping kube-system containers ...
	I1124 09:25:49.105296 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name: Namespaces:[kube-system]}
	I1124 09:25:49.105351 1707070 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1124 09:25:49.142256 1707070 cri.go:89] found id: ""
	I1124 09:25:49.142317 1707070 ssh_runner.go:195] Run: sudo systemctl stop kubelet
	I1124 09:25:49.162851 1707070 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1124 09:25:49.170804 1707070 kubeadm.go:158] found existing configuration files:
	-rw------- 1 root root 5635 Nov 24 09:15 /etc/kubernetes/admin.conf
	-rw------- 1 root root 5636 Nov 24 09:15 /etc/kubernetes/controller-manager.conf
	-rw------- 1 root root 5672 Nov 24 09:15 /etc/kubernetes/kubelet.conf
	-rw------- 1 root root 5584 Nov 24 09:15 /etc/kubernetes/scheduler.conf
	
	I1124 09:25:49.170876 1707070 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1124 09:25:49.178603 1707070 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1124 09:25:49.185907 1707070 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1124 09:25:49.185964 1707070 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1124 09:25:49.193453 1707070 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1124 09:25:49.200815 1707070 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1124 09:25:49.200869 1707070 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1124 09:25:49.208328 1707070 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1124 09:25:49.215968 1707070 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1124 09:25:49.216025 1707070 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1124 09:25:49.223400 1707070 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1124 09:25:49.230953 1707070 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase certs all --config /var/tmp/minikube/kubeadm.yaml"
	I1124 09:25:49.277779 1707070 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase kubeconfig all --config /var/tmp/minikube/kubeadm.yaml"
	I1124 09:25:50.308934 1707070 ssh_runner.go:235] Completed: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase kubeconfig all --config /var/tmp/minikube/kubeadm.yaml": (1.031131442s)
	I1124 09:25:50.308993 1707070 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase kubelet-start --config /var/tmp/minikube/kubeadm.yaml"
	I1124 09:25:50.511648 1707070 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase control-plane all --config /var/tmp/minikube/kubeadm.yaml"
	I1124 09:25:50.576653 1707070 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase etcd local --config /var/tmp/minikube/kubeadm.yaml"
	I1124 09:25:50.625775 1707070 api_server.go:52] waiting for apiserver process to appear ...
	I1124 09:25:50.625855 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:51.126713 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:51.625939 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:52.126677 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:52.626053 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:53.126113 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:53.626972 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:54.126493 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:54.626036 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:55.126171 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:55.626853 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:56.126041 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:56.626177 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:57.126019 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:57.626847 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:58.126017 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:58.626716 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:59.125997 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:59.626367 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:00.125951 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:00.626013 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:01.126844 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:01.626038 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:02.126420 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:02.626727 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:03.126582 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:03.626068 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:04.126304 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:04.626830 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:05.126754 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:05.625961 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:06.126197 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:06.626039 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:07.126915 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:07.626052 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:08.126281 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:08.626116 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:09.126574 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:09.626068 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:10.125978 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:10.626328 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:11.126416 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:11.626073 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:12.126027 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:12.626174 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:13.126044 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:13.626781 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:14.126849 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:14.626203 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:15.125957 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:15.626068 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:16.126934 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:16.626382 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:17.126245 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:17.626034 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:18.126745 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:18.626942 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:19.126393 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:19.626607 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:20.126050 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:20.626732 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:21.126049 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:21.626115 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:22.125988 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:22.626261 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:23.126293 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:23.626107 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:24.126971 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:24.626009 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:25.126859 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:25.626876 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:26.126041 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:26.625983 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:27.126168 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:27.626079 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:28.126047 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:28.626761 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:29.126598 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:29.626290 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:30.125941 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:30.626102 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:31.126717 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:31.626588 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:32.126223 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:32.626875 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:33.126051 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:33.625963 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:34.126808 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:34.626621 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:35.126147 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:35.626018 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:36.126039 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:36.625970 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:37.126579 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:37.626198 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:38.126718 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:38.626386 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:39.126159 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:39.626590 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:40.126050 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:40.626422 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:41.126600 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:41.626097 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:42.127732 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:42.626108 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:43.126855 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:43.626202 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:44.126380 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:44.626423 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:45.127019 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:45.626257 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:46.125911 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:46.626125 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:47.126026 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:47.626915 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:48.126322 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:48.626706 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:49.126864 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:49.627009 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:50.126375 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:50.626418 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:26:50.626521 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:26:50.654529 1707070 cri.go:89] found id: ""
	I1124 09:26:50.654543 1707070 logs.go:282] 0 containers: []
	W1124 09:26:50.654550 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:26:50.654555 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:26:50.654624 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:26:50.683038 1707070 cri.go:89] found id: ""
	I1124 09:26:50.683052 1707070 logs.go:282] 0 containers: []
	W1124 09:26:50.683059 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:26:50.683064 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:26:50.683121 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:26:50.711396 1707070 cri.go:89] found id: ""
	I1124 09:26:50.711410 1707070 logs.go:282] 0 containers: []
	W1124 09:26:50.711422 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:26:50.711433 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:26:50.711498 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:26:50.735435 1707070 cri.go:89] found id: ""
	I1124 09:26:50.735449 1707070 logs.go:282] 0 containers: []
	W1124 09:26:50.735457 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:26:50.735463 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:26:50.735520 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:26:50.760437 1707070 cri.go:89] found id: ""
	I1124 09:26:50.760451 1707070 logs.go:282] 0 containers: []
	W1124 09:26:50.760458 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:26:50.760464 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:26:50.760520 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:26:50.785555 1707070 cri.go:89] found id: ""
	I1124 09:26:50.785576 1707070 logs.go:282] 0 containers: []
	W1124 09:26:50.785584 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:26:50.785590 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:26:50.785662 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:26:50.810261 1707070 cri.go:89] found id: ""
	I1124 09:26:50.810278 1707070 logs.go:282] 0 containers: []
	W1124 09:26:50.810286 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:26:50.810294 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:26:50.810305 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:26:50.879322 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:26:50.870488   11382 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:50.871030   11382 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:50.872890   11382 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:50.873352   11382 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:50.875005   11382 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:26:50.870488   11382 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:50.871030   11382 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:50.872890   11382 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:50.873352   11382 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:50.875005   11382 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:26:50.879334 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:26:50.879345 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:26:50.941117 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:26:50.941140 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:26:50.969259 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:26:50.969275 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:26:51.024741 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:26:51.024763 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:26:53.542977 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:53.553083 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:26:53.553155 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:26:53.577781 1707070 cri.go:89] found id: ""
	I1124 09:26:53.577795 1707070 logs.go:282] 0 containers: []
	W1124 09:26:53.577802 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:26:53.577808 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:26:53.577866 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:26:53.604191 1707070 cri.go:89] found id: ""
	I1124 09:26:53.604205 1707070 logs.go:282] 0 containers: []
	W1124 09:26:53.604212 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:26:53.604217 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:26:53.604277 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:26:53.632984 1707070 cri.go:89] found id: ""
	I1124 09:26:53.632998 1707070 logs.go:282] 0 containers: []
	W1124 09:26:53.633004 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:26:53.633010 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:26:53.633071 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:26:53.663828 1707070 cri.go:89] found id: ""
	I1124 09:26:53.663842 1707070 logs.go:282] 0 containers: []
	W1124 09:26:53.663850 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:26:53.663856 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:26:53.663912 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:26:53.695173 1707070 cri.go:89] found id: ""
	I1124 09:26:53.695187 1707070 logs.go:282] 0 containers: []
	W1124 09:26:53.695195 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:26:53.695200 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:26:53.695259 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:26:53.719882 1707070 cri.go:89] found id: ""
	I1124 09:26:53.719897 1707070 logs.go:282] 0 containers: []
	W1124 09:26:53.719904 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:26:53.719910 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:26:53.719993 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:26:53.753006 1707070 cri.go:89] found id: ""
	I1124 09:26:53.753020 1707070 logs.go:282] 0 containers: []
	W1124 09:26:53.753038 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:26:53.753046 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:26:53.753057 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:26:53.810839 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:26:53.810864 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:26:53.828132 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:26:53.828149 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:26:53.893802 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:26:53.885327   11494 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:53.886130   11494 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:53.888016   11494 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:53.888539   11494 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:53.890056   11494 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:26:53.885327   11494 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:53.886130   11494 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:53.888016   11494 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:53.888539   11494 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:53.890056   11494 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:26:53.893815 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:26:53.893825 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:26:53.955840 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:26:53.955860 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:26:56.485625 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:56.495752 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:26:56.495812 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:26:56.523600 1707070 cri.go:89] found id: ""
	I1124 09:26:56.523614 1707070 logs.go:282] 0 containers: []
	W1124 09:26:56.523622 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:26:56.523627 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:26:56.523730 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:26:56.547432 1707070 cri.go:89] found id: ""
	I1124 09:26:56.547445 1707070 logs.go:282] 0 containers: []
	W1124 09:26:56.547453 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:26:56.547465 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:26:56.547522 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:26:56.571895 1707070 cri.go:89] found id: ""
	I1124 09:26:56.571909 1707070 logs.go:282] 0 containers: []
	W1124 09:26:56.571917 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:26:56.571922 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:26:56.571977 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:26:56.596624 1707070 cri.go:89] found id: ""
	I1124 09:26:56.596637 1707070 logs.go:282] 0 containers: []
	W1124 09:26:56.596644 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:26:56.596650 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:26:56.596705 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:26:56.621497 1707070 cri.go:89] found id: ""
	I1124 09:26:56.621511 1707070 logs.go:282] 0 containers: []
	W1124 09:26:56.621518 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:26:56.621523 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:26:56.621588 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:26:56.656808 1707070 cri.go:89] found id: ""
	I1124 09:26:56.656822 1707070 logs.go:282] 0 containers: []
	W1124 09:26:56.656829 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:26:56.656834 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:26:56.656891 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:26:56.693750 1707070 cri.go:89] found id: ""
	I1124 09:26:56.693763 1707070 logs.go:282] 0 containers: []
	W1124 09:26:56.693770 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:26:56.693778 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:26:56.693799 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:26:56.711624 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:26:56.711642 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:26:56.772006 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:26:56.764543   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:56.764946   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:56.766216   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:56.766780   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:56.768382   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:26:56.764543   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:56.764946   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:56.766216   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:56.766780   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:56.768382   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:26:56.772020 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:26:56.772030 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:26:56.832784 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:26:56.832805 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:26:56.862164 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:26:56.862179 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:26:59.417328 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:59.427445 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:26:59.427506 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:26:59.451539 1707070 cri.go:89] found id: ""
	I1124 09:26:59.451574 1707070 logs.go:282] 0 containers: []
	W1124 09:26:59.451582 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:26:59.451588 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:26:59.451647 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:26:59.476110 1707070 cri.go:89] found id: ""
	I1124 09:26:59.476124 1707070 logs.go:282] 0 containers: []
	W1124 09:26:59.476131 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:26:59.476137 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:26:59.476194 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:26:59.504520 1707070 cri.go:89] found id: ""
	I1124 09:26:59.504533 1707070 logs.go:282] 0 containers: []
	W1124 09:26:59.504540 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:26:59.504546 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:26:59.504607 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:26:59.529647 1707070 cri.go:89] found id: ""
	I1124 09:26:59.529662 1707070 logs.go:282] 0 containers: []
	W1124 09:26:59.529669 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:26:59.529674 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:26:59.529753 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:26:59.558904 1707070 cri.go:89] found id: ""
	I1124 09:26:59.558918 1707070 logs.go:282] 0 containers: []
	W1124 09:26:59.558925 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:26:59.558930 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:26:59.558999 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:26:59.583698 1707070 cri.go:89] found id: ""
	I1124 09:26:59.583712 1707070 logs.go:282] 0 containers: []
	W1124 09:26:59.583733 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:26:59.583738 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:26:59.583800 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:26:59.607605 1707070 cri.go:89] found id: ""
	I1124 09:26:59.607619 1707070 logs.go:282] 0 containers: []
	W1124 09:26:59.607626 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:26:59.607634 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:26:59.607645 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:26:59.624446 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:26:59.624462 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:26:59.711588 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:26:59.701837   11699 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:59.703242   11699 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:59.704228   11699 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:59.706009   11699 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:59.706513   11699 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:26:59.701837   11699 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:59.703242   11699 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:59.704228   11699 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:59.706009   11699 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:59.706513   11699 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:26:59.711600 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:26:59.711610 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:26:59.777617 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:26:59.777638 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:26:59.810868 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:26:59.810888 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:02.368395 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:02.379444 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:02.379503 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:02.403995 1707070 cri.go:89] found id: ""
	I1124 09:27:02.404009 1707070 logs.go:282] 0 containers: []
	W1124 09:27:02.404017 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:02.404022 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:02.404080 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:02.428532 1707070 cri.go:89] found id: ""
	I1124 09:27:02.428546 1707070 logs.go:282] 0 containers: []
	W1124 09:27:02.428553 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:02.428559 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:02.428623 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:02.455148 1707070 cri.go:89] found id: ""
	I1124 09:27:02.455162 1707070 logs.go:282] 0 containers: []
	W1124 09:27:02.455169 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:02.455174 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:02.455233 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:02.479942 1707070 cri.go:89] found id: ""
	I1124 09:27:02.479957 1707070 logs.go:282] 0 containers: []
	W1124 09:27:02.479969 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:02.479975 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:02.480034 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:02.505728 1707070 cri.go:89] found id: ""
	I1124 09:27:02.505744 1707070 logs.go:282] 0 containers: []
	W1124 09:27:02.505751 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:02.505760 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:02.505845 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:02.536863 1707070 cri.go:89] found id: ""
	I1124 09:27:02.536881 1707070 logs.go:282] 0 containers: []
	W1124 09:27:02.536889 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:02.536894 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:02.536960 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:02.566083 1707070 cri.go:89] found id: ""
	I1124 09:27:02.566107 1707070 logs.go:282] 0 containers: []
	W1124 09:27:02.566124 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:02.566132 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:02.566142 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:02.628402 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:02.628423 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:02.669505 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:02.669523 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:02.737879 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:02.737907 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:02.755317 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:02.755334 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:02.820465 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:02.811248   11818 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:02.812608   11818 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:02.813513   11818 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:02.815318   11818 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:02.815727   11818 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:02.811248   11818 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:02.812608   11818 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:02.813513   11818 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:02.815318   11818 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:02.815727   11818 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:05.320749 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:05.331020 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:05.331081 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:05.355889 1707070 cri.go:89] found id: ""
	I1124 09:27:05.355904 1707070 logs.go:282] 0 containers: []
	W1124 09:27:05.355912 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:05.355917 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:05.355980 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:05.381650 1707070 cri.go:89] found id: ""
	I1124 09:27:05.381664 1707070 logs.go:282] 0 containers: []
	W1124 09:27:05.381671 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:05.381676 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:05.381733 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:05.410311 1707070 cri.go:89] found id: ""
	I1124 09:27:05.410325 1707070 logs.go:282] 0 containers: []
	W1124 09:27:05.410332 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:05.410337 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:05.410396 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:05.434601 1707070 cri.go:89] found id: ""
	I1124 09:27:05.434615 1707070 logs.go:282] 0 containers: []
	W1124 09:27:05.434621 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:05.434627 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:05.434684 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:05.459196 1707070 cri.go:89] found id: ""
	I1124 09:27:05.459210 1707070 logs.go:282] 0 containers: []
	W1124 09:27:05.459218 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:05.459223 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:05.459294 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:05.483433 1707070 cri.go:89] found id: ""
	I1124 09:27:05.483448 1707070 logs.go:282] 0 containers: []
	W1124 09:27:05.483455 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:05.483460 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:05.483523 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:05.508072 1707070 cri.go:89] found id: ""
	I1124 09:27:05.508086 1707070 logs.go:282] 0 containers: []
	W1124 09:27:05.508093 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:05.508101 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:05.508111 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:05.563733 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:05.563752 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:05.584705 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:05.584736 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:05.666380 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:05.657873   11904 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:05.658740   11904 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:05.660432   11904 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:05.660828   11904 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:05.662363   11904 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:05.657873   11904 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:05.658740   11904 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:05.660432   11904 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:05.660828   11904 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:05.662363   11904 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:05.666394 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:05.666405 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:05.738526 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:05.738548 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:08.268404 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:08.278347 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:08.278408 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:08.303562 1707070 cri.go:89] found id: ""
	I1124 09:27:08.303577 1707070 logs.go:282] 0 containers: []
	W1124 09:27:08.303585 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:08.303590 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:08.303651 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:08.329886 1707070 cri.go:89] found id: ""
	I1124 09:27:08.329900 1707070 logs.go:282] 0 containers: []
	W1124 09:27:08.329907 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:08.329913 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:08.329971 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:08.355081 1707070 cri.go:89] found id: ""
	I1124 09:27:08.355096 1707070 logs.go:282] 0 containers: []
	W1124 09:27:08.355104 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:08.355110 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:08.355175 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:08.381511 1707070 cri.go:89] found id: ""
	I1124 09:27:08.381534 1707070 logs.go:282] 0 containers: []
	W1124 09:27:08.381543 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:08.381549 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:08.381620 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:08.410606 1707070 cri.go:89] found id: ""
	I1124 09:27:08.410629 1707070 logs.go:282] 0 containers: []
	W1124 09:27:08.410637 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:08.410642 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:08.410700 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:08.434980 1707070 cri.go:89] found id: ""
	I1124 09:27:08.434994 1707070 logs.go:282] 0 containers: []
	W1124 09:27:08.435001 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:08.435007 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:08.435064 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:08.463780 1707070 cri.go:89] found id: ""
	I1124 09:27:08.463793 1707070 logs.go:282] 0 containers: []
	W1124 09:27:08.463800 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:08.463808 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:08.463819 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:08.527201 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:08.518614   12003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:08.519320   12003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:08.521220   12003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:08.521832   12003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:08.523649   12003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:08.518614   12003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:08.519320   12003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:08.521220   12003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:08.521832   12003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:08.523649   12003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:08.527213 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:08.527223 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:08.591559 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:08.591581 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:08.619107 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:08.619125 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:08.678658 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:08.678675 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:11.199028 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:11.209463 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:11.209529 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:11.236040 1707070 cri.go:89] found id: ""
	I1124 09:27:11.236061 1707070 logs.go:282] 0 containers: []
	W1124 09:27:11.236069 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:11.236075 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:11.236145 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:11.263895 1707070 cri.go:89] found id: ""
	I1124 09:27:11.263906 1707070 logs.go:282] 0 containers: []
	W1124 09:27:11.263912 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:11.263917 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:11.263968 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:11.290492 1707070 cri.go:89] found id: ""
	I1124 09:27:11.290507 1707070 logs.go:282] 0 containers: []
	W1124 09:27:11.290514 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:11.290519 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:11.290575 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:11.316763 1707070 cri.go:89] found id: ""
	I1124 09:27:11.316778 1707070 logs.go:282] 0 containers: []
	W1124 09:27:11.316785 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:11.316791 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:11.316899 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:11.340653 1707070 cri.go:89] found id: ""
	I1124 09:27:11.340668 1707070 logs.go:282] 0 containers: []
	W1124 09:27:11.340675 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:11.340680 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:11.340741 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:11.365000 1707070 cri.go:89] found id: ""
	I1124 09:27:11.365013 1707070 logs.go:282] 0 containers: []
	W1124 09:27:11.365020 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:11.365026 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:11.365086 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:11.393012 1707070 cri.go:89] found id: ""
	I1124 09:27:11.393025 1707070 logs.go:282] 0 containers: []
	W1124 09:27:11.393033 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:11.393041 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:11.393053 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:11.409740 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:11.409758 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:11.474068 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:11.465242   12112 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:11.466095   12112 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:11.467959   12112 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:11.468588   12112 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:11.470448   12112 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:11.465242   12112 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:11.466095   12112 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:11.467959   12112 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:11.468588   12112 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:11.470448   12112 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:11.474079 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:11.474089 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:11.535411 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:11.535433 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:11.565626 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:11.565645 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:14.123823 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:14.133770 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:14.133829 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:14.157476 1707070 cri.go:89] found id: ""
	I1124 09:27:14.157490 1707070 logs.go:282] 0 containers: []
	W1124 09:27:14.157497 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:14.157503 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:14.157562 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:14.188747 1707070 cri.go:89] found id: ""
	I1124 09:27:14.188761 1707070 logs.go:282] 0 containers: []
	W1124 09:27:14.188768 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:14.188773 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:14.188830 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:14.216257 1707070 cri.go:89] found id: ""
	I1124 09:27:14.216271 1707070 logs.go:282] 0 containers: []
	W1124 09:27:14.216279 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:14.216284 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:14.216345 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:14.241336 1707070 cri.go:89] found id: ""
	I1124 09:27:14.241349 1707070 logs.go:282] 0 containers: []
	W1124 09:27:14.241357 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:14.241362 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:14.241423 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:14.265223 1707070 cri.go:89] found id: ""
	I1124 09:27:14.265238 1707070 logs.go:282] 0 containers: []
	W1124 09:27:14.265245 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:14.265250 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:14.265312 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:14.292087 1707070 cri.go:89] found id: ""
	I1124 09:27:14.292101 1707070 logs.go:282] 0 containers: []
	W1124 09:27:14.292108 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:14.292114 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:14.292171 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:14.316839 1707070 cri.go:89] found id: ""
	I1124 09:27:14.316854 1707070 logs.go:282] 0 containers: []
	W1124 09:27:14.316861 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:14.316869 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:14.316879 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:14.371692 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:14.371715 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:14.388964 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:14.388980 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:14.455069 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:14.447375   12220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:14.448018   12220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:14.449517   12220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:14.449819   12220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:14.451683   12220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:14.447375   12220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:14.448018   12220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:14.449517   12220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:14.449819   12220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:14.451683   12220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:14.455080 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:14.455090 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:14.518102 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:14.518124 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:17.045537 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:17.055937 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:17.056004 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:17.084357 1707070 cri.go:89] found id: ""
	I1124 09:27:17.084370 1707070 logs.go:282] 0 containers: []
	W1124 09:27:17.084378 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:17.084383 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:17.084439 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:17.112022 1707070 cri.go:89] found id: ""
	I1124 09:27:17.112035 1707070 logs.go:282] 0 containers: []
	W1124 09:27:17.112043 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:17.112048 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:17.112110 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:17.135317 1707070 cri.go:89] found id: ""
	I1124 09:27:17.135331 1707070 logs.go:282] 0 containers: []
	W1124 09:27:17.135338 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:17.135343 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:17.135399 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:17.163850 1707070 cri.go:89] found id: ""
	I1124 09:27:17.163865 1707070 logs.go:282] 0 containers: []
	W1124 09:27:17.163872 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:17.163878 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:17.163933 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:17.188915 1707070 cri.go:89] found id: ""
	I1124 09:27:17.188929 1707070 logs.go:282] 0 containers: []
	W1124 09:27:17.188936 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:17.188941 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:17.188997 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:17.217448 1707070 cri.go:89] found id: ""
	I1124 09:27:17.217461 1707070 logs.go:282] 0 containers: []
	W1124 09:27:17.217475 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:17.217480 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:17.217537 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:17.242521 1707070 cri.go:89] found id: ""
	I1124 09:27:17.242536 1707070 logs.go:282] 0 containers: []
	W1124 09:27:17.242543 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:17.242551 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:17.242561 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:17.297899 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:17.297921 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:17.315278 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:17.315297 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:17.377620 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:17.368489   12324 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:17.368893   12324 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:17.370596   12324 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:17.371050   12324 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:17.372486   12324 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:17.368489   12324 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:17.368893   12324 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:17.370596   12324 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:17.371050   12324 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:17.372486   12324 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:17.377640 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:17.377651 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:17.439884 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:17.439907 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:19.969337 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:19.979536 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:19.979595 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:20.018198 1707070 cri.go:89] found id: ""
	I1124 09:27:20.018220 1707070 logs.go:282] 0 containers: []
	W1124 09:27:20.018229 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:20.018235 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:20.018297 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:20.046055 1707070 cri.go:89] found id: ""
	I1124 09:27:20.046070 1707070 logs.go:282] 0 containers: []
	W1124 09:27:20.046077 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:20.046082 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:20.046158 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:20.078159 1707070 cri.go:89] found id: ""
	I1124 09:27:20.078183 1707070 logs.go:282] 0 containers: []
	W1124 09:27:20.078191 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:20.078197 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:20.078289 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:20.104136 1707070 cri.go:89] found id: ""
	I1124 09:27:20.104151 1707070 logs.go:282] 0 containers: []
	W1124 09:27:20.104158 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:20.104164 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:20.104228 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:20.130266 1707070 cri.go:89] found id: ""
	I1124 09:27:20.130280 1707070 logs.go:282] 0 containers: []
	W1124 09:27:20.130288 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:20.130293 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:20.130352 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:20.156899 1707070 cri.go:89] found id: ""
	I1124 09:27:20.156913 1707070 logs.go:282] 0 containers: []
	W1124 09:27:20.156921 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:20.156926 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:20.156986 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:20.182706 1707070 cri.go:89] found id: ""
	I1124 09:27:20.182721 1707070 logs.go:282] 0 containers: []
	W1124 09:27:20.182728 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:20.182736 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:20.182747 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:20.240720 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:20.240740 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:20.257971 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:20.257987 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:20.324806 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:20.316231   12432 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:20.316929   12432 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:20.317881   12432 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:20.319464   12432 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:20.319918   12432 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:20.316231   12432 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:20.316929   12432 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:20.317881   12432 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:20.319464   12432 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:20.319918   12432 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:20.324827 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:20.324838 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:20.386188 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:20.386212 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:22.915679 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:22.927190 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:22.927254 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:22.959235 1707070 cri.go:89] found id: ""
	I1124 09:27:22.959249 1707070 logs.go:282] 0 containers: []
	W1124 09:27:22.959256 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:22.959262 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:22.959318 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:22.986124 1707070 cri.go:89] found id: ""
	I1124 09:27:22.986138 1707070 logs.go:282] 0 containers: []
	W1124 09:27:22.986146 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:22.986151 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:22.986206 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:23.016094 1707070 cri.go:89] found id: ""
	I1124 09:27:23.016108 1707070 logs.go:282] 0 containers: []
	W1124 09:27:23.016116 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:23.016121 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:23.016183 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:23.044417 1707070 cri.go:89] found id: ""
	I1124 09:27:23.044431 1707070 logs.go:282] 0 containers: []
	W1124 09:27:23.044439 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:23.044444 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:23.044501 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:23.069468 1707070 cri.go:89] found id: ""
	I1124 09:27:23.069484 1707070 logs.go:282] 0 containers: []
	W1124 09:27:23.069491 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:23.069497 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:23.069556 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:23.096521 1707070 cri.go:89] found id: ""
	I1124 09:27:23.096535 1707070 logs.go:282] 0 containers: []
	W1124 09:27:23.096542 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:23.096548 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:23.096605 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:23.125327 1707070 cri.go:89] found id: ""
	I1124 09:27:23.125342 1707070 logs.go:282] 0 containers: []
	W1124 09:27:23.125349 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:23.125358 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:23.125367 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:23.180584 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:23.180605 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:23.197372 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:23.197388 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:23.259943 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:23.251679   12538 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:23.252410   12538 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:23.253306   12538 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:23.254866   12538 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:23.255334   12538 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:23.251679   12538 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:23.252410   12538 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:23.253306   12538 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:23.254866   12538 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:23.255334   12538 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:23.259953 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:23.259965 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:23.325045 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:23.325066 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:25.855733 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:25.866329 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:25.866395 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:25.906494 1707070 cri.go:89] found id: ""
	I1124 09:27:25.906508 1707070 logs.go:282] 0 containers: []
	W1124 09:27:25.906516 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:25.906521 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:25.906590 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:25.945205 1707070 cri.go:89] found id: ""
	I1124 09:27:25.945229 1707070 logs.go:282] 0 containers: []
	W1124 09:27:25.945237 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:25.945242 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:25.945301 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:25.970721 1707070 cri.go:89] found id: ""
	I1124 09:27:25.970736 1707070 logs.go:282] 0 containers: []
	W1124 09:27:25.970743 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:25.970749 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:25.970807 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:25.997334 1707070 cri.go:89] found id: ""
	I1124 09:27:25.997348 1707070 logs.go:282] 0 containers: []
	W1124 09:27:25.997355 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:25.997364 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:25.997438 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:26.029916 1707070 cri.go:89] found id: ""
	I1124 09:27:26.029932 1707070 logs.go:282] 0 containers: []
	W1124 09:27:26.029940 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:26.029945 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:26.030007 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:26.057466 1707070 cri.go:89] found id: ""
	I1124 09:27:26.057480 1707070 logs.go:282] 0 containers: []
	W1124 09:27:26.057488 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:26.057494 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:26.057565 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:26.083489 1707070 cri.go:89] found id: ""
	I1124 09:27:26.083503 1707070 logs.go:282] 0 containers: []
	W1124 09:27:26.083511 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:26.083519 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:26.083529 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:26.140569 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:26.140588 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:26.158554 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:26.158571 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:26.230573 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:26.222615   12645 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:26.223218   12645 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:26.224819   12645 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:26.225472   12645 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:26.226976   12645 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:26.222615   12645 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:26.223218   12645 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:26.224819   12645 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:26.225472   12645 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:26.226976   12645 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:26.230583 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:26.230594 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:26.292417 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:26.292436 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:28.819944 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:28.830528 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:28.830587 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:28.854228 1707070 cri.go:89] found id: ""
	I1124 09:27:28.854243 1707070 logs.go:282] 0 containers: []
	W1124 09:27:28.854250 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:28.854260 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:28.854324 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:28.891203 1707070 cri.go:89] found id: ""
	I1124 09:27:28.891217 1707070 logs.go:282] 0 containers: []
	W1124 09:27:28.891224 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:28.891230 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:28.891305 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:28.918573 1707070 cri.go:89] found id: ""
	I1124 09:27:28.918587 1707070 logs.go:282] 0 containers: []
	W1124 09:27:28.918594 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:28.918600 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:28.918665 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:28.944672 1707070 cri.go:89] found id: ""
	I1124 09:27:28.944685 1707070 logs.go:282] 0 containers: []
	W1124 09:27:28.944692 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:28.944708 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:28.944763 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:28.970414 1707070 cri.go:89] found id: ""
	I1124 09:27:28.970429 1707070 logs.go:282] 0 containers: []
	W1124 09:27:28.970436 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:28.970441 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:28.970539 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:28.995438 1707070 cri.go:89] found id: ""
	I1124 09:27:28.995453 1707070 logs.go:282] 0 containers: []
	W1124 09:27:28.995460 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:28.995466 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:28.995526 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:29.023817 1707070 cri.go:89] found id: ""
	I1124 09:27:29.023832 1707070 logs.go:282] 0 containers: []
	W1124 09:27:29.023839 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:29.023847 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:29.023858 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:29.080316 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:29.080336 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:29.097486 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:29.097502 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:29.159875 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:29.151793   12751 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:29.152163   12751 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:29.153608   12751 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:29.154019   12751 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:29.155829   12751 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:29.151793   12751 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:29.152163   12751 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:29.153608   12751 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:29.154019   12751 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:29.155829   12751 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:29.159888 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:29.159907 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:29.223729 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:29.223754 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:31.751641 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:31.761798 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:31.761859 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:31.788691 1707070 cri.go:89] found id: ""
	I1124 09:27:31.788705 1707070 logs.go:282] 0 containers: []
	W1124 09:27:31.788711 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:31.788717 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:31.788776 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:31.812359 1707070 cri.go:89] found id: ""
	I1124 09:27:31.812374 1707070 logs.go:282] 0 containers: []
	W1124 09:27:31.812382 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:31.812387 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:31.812450 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:31.837276 1707070 cri.go:89] found id: ""
	I1124 09:27:31.837289 1707070 logs.go:282] 0 containers: []
	W1124 09:27:31.837296 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:31.837302 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:31.837360 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:31.862818 1707070 cri.go:89] found id: ""
	I1124 09:27:31.862832 1707070 logs.go:282] 0 containers: []
	W1124 09:27:31.862840 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:31.862846 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:31.862903 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:31.904922 1707070 cri.go:89] found id: ""
	I1124 09:27:31.904936 1707070 logs.go:282] 0 containers: []
	W1124 09:27:31.904944 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:31.904950 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:31.905012 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:31.949580 1707070 cri.go:89] found id: ""
	I1124 09:27:31.949594 1707070 logs.go:282] 0 containers: []
	W1124 09:27:31.949601 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:31.949607 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:31.949661 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:31.975157 1707070 cri.go:89] found id: ""
	I1124 09:27:31.975171 1707070 logs.go:282] 0 containers: []
	W1124 09:27:31.975178 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:31.975187 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:31.975198 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:32.004216 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:32.004239 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:32.064444 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:32.064466 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:32.084210 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:32.084229 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:32.152949 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:32.144237   12868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:32.145124   12868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:32.147159   12868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:32.147890   12868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:32.148900   12868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:32.144237   12868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:32.145124   12868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:32.147159   12868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:32.147890   12868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:32.148900   12868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:32.152963 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:32.152975 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:34.714493 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:34.725033 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:34.725101 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:34.750339 1707070 cri.go:89] found id: ""
	I1124 09:27:34.750352 1707070 logs.go:282] 0 containers: []
	W1124 09:27:34.750359 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:34.750365 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:34.750422 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:34.776574 1707070 cri.go:89] found id: ""
	I1124 09:27:34.776588 1707070 logs.go:282] 0 containers: []
	W1124 09:27:34.776595 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:34.776600 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:34.776656 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:34.801274 1707070 cri.go:89] found id: ""
	I1124 09:27:34.801288 1707070 logs.go:282] 0 containers: []
	W1124 09:27:34.801295 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:34.801300 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:34.801355 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:34.828204 1707070 cri.go:89] found id: ""
	I1124 09:27:34.828217 1707070 logs.go:282] 0 containers: []
	W1124 09:27:34.828224 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:34.828230 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:34.828286 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:34.856488 1707070 cri.go:89] found id: ""
	I1124 09:27:34.856502 1707070 logs.go:282] 0 containers: []
	W1124 09:27:34.856509 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:34.856514 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:34.856571 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:34.882889 1707070 cri.go:89] found id: ""
	I1124 09:27:34.882903 1707070 logs.go:282] 0 containers: []
	W1124 09:27:34.882914 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:34.882919 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:34.882988 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:34.914562 1707070 cri.go:89] found id: ""
	I1124 09:27:34.914576 1707070 logs.go:282] 0 containers: []
	W1124 09:27:34.914583 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:34.914591 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:34.914601 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:34.981562 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:34.981596 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:34.998925 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:34.998941 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:35.070877 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:35.062206   12962 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:35.063028   12962 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:35.064710   12962 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:35.065308   12962 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:35.067060   12962 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:35.062206   12962 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:35.063028   12962 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:35.064710   12962 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:35.065308   12962 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:35.067060   12962 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:35.070899 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:35.070909 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:35.137172 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:35.137193 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:37.666865 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:37.677121 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:37.677182 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:37.702376 1707070 cri.go:89] found id: ""
	I1124 09:27:37.702390 1707070 logs.go:282] 0 containers: []
	W1124 09:27:37.702398 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:37.702407 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:37.702491 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:37.727342 1707070 cri.go:89] found id: ""
	I1124 09:27:37.727355 1707070 logs.go:282] 0 containers: []
	W1124 09:27:37.727363 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:37.727368 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:37.727430 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:37.753323 1707070 cri.go:89] found id: ""
	I1124 09:27:37.753336 1707070 logs.go:282] 0 containers: []
	W1124 09:27:37.753343 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:37.753349 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:37.753409 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:37.781020 1707070 cri.go:89] found id: ""
	I1124 09:27:37.781041 1707070 logs.go:282] 0 containers: []
	W1124 09:27:37.781049 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:37.781055 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:37.781117 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:37.805925 1707070 cri.go:89] found id: ""
	I1124 09:27:37.805939 1707070 logs.go:282] 0 containers: []
	W1124 09:27:37.805946 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:37.805952 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:37.806013 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:37.833036 1707070 cri.go:89] found id: ""
	I1124 09:27:37.833062 1707070 logs.go:282] 0 containers: []
	W1124 09:27:37.833069 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:37.833075 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:37.833140 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:37.860115 1707070 cri.go:89] found id: ""
	I1124 09:27:37.860129 1707070 logs.go:282] 0 containers: []
	W1124 09:27:37.860137 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:37.860145 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:37.860156 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:37.926098 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:37.926118 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:37.960030 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:37.960045 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:38.019375 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:38.019395 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:38.039066 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:38.039085 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:38.110062 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:38.101570   13079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:38.102692   13079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:38.104495   13079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:38.105053   13079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:38.106366   13079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:38.101570   13079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:38.102692   13079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:38.104495   13079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:38.105053   13079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:38.106366   13079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:40.610482 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:40.620402 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:40.620472 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:40.648289 1707070 cri.go:89] found id: ""
	I1124 09:27:40.648303 1707070 logs.go:282] 0 containers: []
	W1124 09:27:40.648311 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:40.648317 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:40.648373 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:40.672588 1707070 cri.go:89] found id: ""
	I1124 09:27:40.672603 1707070 logs.go:282] 0 containers: []
	W1124 09:27:40.672610 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:40.672616 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:40.672673 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:40.700039 1707070 cri.go:89] found id: ""
	I1124 09:27:40.700053 1707070 logs.go:282] 0 containers: []
	W1124 09:27:40.700060 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:40.700066 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:40.700129 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:40.728494 1707070 cri.go:89] found id: ""
	I1124 09:27:40.728508 1707070 logs.go:282] 0 containers: []
	W1124 09:27:40.728516 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:40.728522 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:40.728582 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:40.753773 1707070 cri.go:89] found id: ""
	I1124 09:27:40.753786 1707070 logs.go:282] 0 containers: []
	W1124 09:27:40.753793 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:40.753798 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:40.753860 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:40.778243 1707070 cri.go:89] found id: ""
	I1124 09:27:40.778257 1707070 logs.go:282] 0 containers: []
	W1124 09:27:40.778264 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:40.778270 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:40.778333 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:40.804316 1707070 cri.go:89] found id: ""
	I1124 09:27:40.804329 1707070 logs.go:282] 0 containers: []
	W1124 09:27:40.804350 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:40.804358 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:40.804370 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:40.821314 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:40.821330 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:40.901213 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:40.878028   13164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:40.878824   13164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:40.894654   13164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:40.895170   13164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:40.896920   13164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:40.878028   13164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:40.878824   13164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:40.894654   13164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:40.895170   13164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:40.896920   13164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:40.901232 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:40.901242 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:40.972785 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:40.972806 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:41.000947 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:41.000967 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:43.560416 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:43.570821 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:43.570882 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:43.595557 1707070 cri.go:89] found id: ""
	I1124 09:27:43.595571 1707070 logs.go:282] 0 containers: []
	W1124 09:27:43.595579 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:43.595585 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:43.595640 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:43.623980 1707070 cri.go:89] found id: ""
	I1124 09:27:43.623996 1707070 logs.go:282] 0 containers: []
	W1124 09:27:43.624003 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:43.624008 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:43.624074 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:43.649674 1707070 cri.go:89] found id: ""
	I1124 09:27:43.649688 1707070 logs.go:282] 0 containers: []
	W1124 09:27:43.649695 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:43.649701 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:43.649758 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:43.673375 1707070 cri.go:89] found id: ""
	I1124 09:27:43.673388 1707070 logs.go:282] 0 containers: []
	W1124 09:27:43.673397 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:43.673403 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:43.673459 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:43.700917 1707070 cri.go:89] found id: ""
	I1124 09:27:43.700931 1707070 logs.go:282] 0 containers: []
	W1124 09:27:43.700938 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:43.700943 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:43.701000 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:43.725453 1707070 cri.go:89] found id: ""
	I1124 09:27:43.725467 1707070 logs.go:282] 0 containers: []
	W1124 09:27:43.725481 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:43.725487 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:43.725557 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:43.755304 1707070 cri.go:89] found id: ""
	I1124 09:27:43.755318 1707070 logs.go:282] 0 containers: []
	W1124 09:27:43.755326 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:43.755335 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:43.755346 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:43.772549 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:43.772567 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:43.837565 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:43.829378   13269 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:43.829969   13269 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:43.831587   13269 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:43.832265   13269 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:43.833938   13269 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:43.829378   13269 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:43.829969   13269 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:43.831587   13269 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:43.832265   13269 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:43.833938   13269 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:43.837575 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:43.837587 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:43.898949 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:43.898969 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:43.934259 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:43.934277 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:46.497111 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:46.507177 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:46.507251 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:46.531012 1707070 cri.go:89] found id: ""
	I1124 09:27:46.531025 1707070 logs.go:282] 0 containers: []
	W1124 09:27:46.531032 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:46.531038 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:46.531101 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:46.555781 1707070 cri.go:89] found id: ""
	I1124 09:27:46.555795 1707070 logs.go:282] 0 containers: []
	W1124 09:27:46.555802 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:46.555807 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:46.555864 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:46.580956 1707070 cri.go:89] found id: ""
	I1124 09:27:46.580974 1707070 logs.go:282] 0 containers: []
	W1124 09:27:46.580982 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:46.580987 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:46.581055 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:46.606320 1707070 cri.go:89] found id: ""
	I1124 09:27:46.606333 1707070 logs.go:282] 0 containers: []
	W1124 09:27:46.606340 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:46.606346 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:46.606414 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:46.632671 1707070 cri.go:89] found id: ""
	I1124 09:27:46.632685 1707070 logs.go:282] 0 containers: []
	W1124 09:27:46.632692 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:46.632697 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:46.632755 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:46.656948 1707070 cri.go:89] found id: ""
	I1124 09:27:46.656962 1707070 logs.go:282] 0 containers: []
	W1124 09:27:46.656969 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:46.656975 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:46.657037 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:46.681897 1707070 cri.go:89] found id: ""
	I1124 09:27:46.681910 1707070 logs.go:282] 0 containers: []
	W1124 09:27:46.681917 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:46.681925 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:46.681936 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:46.698822 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:46.698839 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:46.763473 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:46.755294   13373 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:46.755864   13373 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:46.757448   13373 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:46.758065   13373 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:46.759847   13373 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:46.755294   13373 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:46.755864   13373 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:46.757448   13373 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:46.758065   13373 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:46.759847   13373 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:46.763499 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:46.763510 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:46.826271 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:46.826293 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:46.855001 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:46.855017 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:49.412865 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:49.423511 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:49.423574 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:49.447618 1707070 cri.go:89] found id: ""
	I1124 09:27:49.447632 1707070 logs.go:282] 0 containers: []
	W1124 09:27:49.447639 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:49.447645 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:49.447705 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:49.476127 1707070 cri.go:89] found id: ""
	I1124 09:27:49.476140 1707070 logs.go:282] 0 containers: []
	W1124 09:27:49.476147 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:49.476154 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:49.476213 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:49.501684 1707070 cri.go:89] found id: ""
	I1124 09:27:49.501697 1707070 logs.go:282] 0 containers: []
	W1124 09:27:49.501705 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:49.501711 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:49.501771 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:49.527011 1707070 cri.go:89] found id: ""
	I1124 09:27:49.527025 1707070 logs.go:282] 0 containers: []
	W1124 09:27:49.527033 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:49.527038 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:49.527098 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:49.552026 1707070 cri.go:89] found id: ""
	I1124 09:27:49.552040 1707070 logs.go:282] 0 containers: []
	W1124 09:27:49.552047 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:49.552053 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:49.552110 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:49.582162 1707070 cri.go:89] found id: ""
	I1124 09:27:49.582189 1707070 logs.go:282] 0 containers: []
	W1124 09:27:49.582196 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:49.582202 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:49.582275 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:49.612653 1707070 cri.go:89] found id: ""
	I1124 09:27:49.612667 1707070 logs.go:282] 0 containers: []
	W1124 09:27:49.612675 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:49.612683 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:49.612693 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:49.668483 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:49.668504 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:49.685463 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:49.685480 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:49.750076 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:49.741868   13477 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:49.742309   13477 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:49.744083   13477 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:49.744608   13477 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:49.746375   13477 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:49.741868   13477 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:49.742309   13477 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:49.744083   13477 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:49.744608   13477 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:49.746375   13477 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:49.750136 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:49.750148 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:49.811614 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:49.811634 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:52.341239 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:52.351722 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:52.351784 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:52.378388 1707070 cri.go:89] found id: ""
	I1124 09:27:52.378402 1707070 logs.go:282] 0 containers: []
	W1124 09:27:52.378410 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:52.378416 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:52.378498 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:52.404052 1707070 cri.go:89] found id: ""
	I1124 09:27:52.404067 1707070 logs.go:282] 0 containers: []
	W1124 09:27:52.404074 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:52.404079 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:52.404138 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:52.428854 1707070 cri.go:89] found id: ""
	I1124 09:27:52.428868 1707070 logs.go:282] 0 containers: []
	W1124 09:27:52.428876 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:52.428882 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:52.428945 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:52.460795 1707070 cri.go:89] found id: ""
	I1124 09:27:52.460808 1707070 logs.go:282] 0 containers: []
	W1124 09:27:52.460815 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:52.460825 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:52.460886 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:52.490351 1707070 cri.go:89] found id: ""
	I1124 09:27:52.490365 1707070 logs.go:282] 0 containers: []
	W1124 09:27:52.490372 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:52.490378 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:52.490438 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:52.515789 1707070 cri.go:89] found id: ""
	I1124 09:27:52.515804 1707070 logs.go:282] 0 containers: []
	W1124 09:27:52.515811 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:52.515816 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:52.515874 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:52.544304 1707070 cri.go:89] found id: ""
	I1124 09:27:52.544318 1707070 logs.go:282] 0 containers: []
	W1124 09:27:52.544326 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:52.544335 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:52.544347 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:52.611718 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:52.603411   13577 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:52.604016   13577 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:52.605628   13577 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:52.606175   13577 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:52.607864   13577 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:52.603411   13577 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:52.604016   13577 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:52.605628   13577 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:52.606175   13577 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:52.607864   13577 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:52.611731 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:52.611743 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:52.679720 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:52.679740 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:52.708422 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:52.708437 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:52.766414 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:52.766433 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:55.285861 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:55.296023 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:55.296086 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:55.324396 1707070 cri.go:89] found id: ""
	I1124 09:27:55.324409 1707070 logs.go:282] 0 containers: []
	W1124 09:27:55.324417 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:55.324422 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:55.324478 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:55.348746 1707070 cri.go:89] found id: ""
	I1124 09:27:55.348760 1707070 logs.go:282] 0 containers: []
	W1124 09:27:55.348767 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:55.348773 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:55.348832 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:55.373685 1707070 cri.go:89] found id: ""
	I1124 09:27:55.373710 1707070 logs.go:282] 0 containers: []
	W1124 09:27:55.373718 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:55.373724 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:55.373780 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:55.399757 1707070 cri.go:89] found id: ""
	I1124 09:27:55.399774 1707070 logs.go:282] 0 containers: []
	W1124 09:27:55.399783 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:55.399789 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:55.399848 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:55.424773 1707070 cri.go:89] found id: ""
	I1124 09:27:55.424788 1707070 logs.go:282] 0 containers: []
	W1124 09:27:55.424795 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:55.424800 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:55.424862 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:55.450083 1707070 cri.go:89] found id: ""
	I1124 09:27:55.450097 1707070 logs.go:282] 0 containers: []
	W1124 09:27:55.450104 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:55.450112 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:55.450170 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:55.474225 1707070 cri.go:89] found id: ""
	I1124 09:27:55.474239 1707070 logs.go:282] 0 containers: []
	W1124 09:27:55.474247 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:55.474254 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:55.474264 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:55.507455 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:55.507477 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:55.563391 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:55.563414 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:55.583115 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:55.583131 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:55.648979 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:55.641409   13700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:55.642033   13700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:55.643543   13700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:55.644021   13700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:55.645529   13700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:55.641409   13700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:55.642033   13700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:55.643543   13700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:55.644021   13700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:55.645529   13700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:55.648991 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:55.649004 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:58.210584 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:58.221285 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:58.221351 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:58.250526 1707070 cri.go:89] found id: ""
	I1124 09:27:58.250541 1707070 logs.go:282] 0 containers: []
	W1124 09:27:58.250548 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:58.250554 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:58.250612 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:58.275099 1707070 cri.go:89] found id: ""
	I1124 09:27:58.275116 1707070 logs.go:282] 0 containers: []
	W1124 09:27:58.275123 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:58.275129 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:58.275189 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:58.300058 1707070 cri.go:89] found id: ""
	I1124 09:27:58.300075 1707070 logs.go:282] 0 containers: []
	W1124 09:27:58.300082 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:58.300087 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:58.300148 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:58.323564 1707070 cri.go:89] found id: ""
	I1124 09:27:58.323578 1707070 logs.go:282] 0 containers: []
	W1124 09:27:58.323585 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:58.323591 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:58.323648 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:58.348441 1707070 cri.go:89] found id: ""
	I1124 09:27:58.348455 1707070 logs.go:282] 0 containers: []
	W1124 09:27:58.348463 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:58.348468 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:58.348527 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:58.374283 1707070 cri.go:89] found id: ""
	I1124 09:27:58.374297 1707070 logs.go:282] 0 containers: []
	W1124 09:27:58.374305 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:58.374310 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:58.374371 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:58.400624 1707070 cri.go:89] found id: ""
	I1124 09:27:58.400638 1707070 logs.go:282] 0 containers: []
	W1124 09:27:58.400645 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:58.400653 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:58.400664 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:58.457055 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:58.457075 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:58.474204 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:58.474236 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:58.538738 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:58.530985   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:58.531628   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:58.533238   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:58.533555   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:58.535049   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:58.530985   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:58.531628   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:58.533238   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:58.533555   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:58.535049   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:58.538748 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:58.538761 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:58.601043 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:58.601064 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:01.129158 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:01.152628 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:01.152709 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:01.199688 1707070 cri.go:89] found id: ""
	I1124 09:28:01.199703 1707070 logs.go:282] 0 containers: []
	W1124 09:28:01.199710 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:01.199716 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:01.199778 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:01.226293 1707070 cri.go:89] found id: ""
	I1124 09:28:01.226307 1707070 logs.go:282] 0 containers: []
	W1124 09:28:01.226314 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:01.226319 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:01.226379 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:01.252021 1707070 cri.go:89] found id: ""
	I1124 09:28:01.252036 1707070 logs.go:282] 0 containers: []
	W1124 09:28:01.252043 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:01.252049 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:01.252108 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:01.278563 1707070 cri.go:89] found id: ""
	I1124 09:28:01.278577 1707070 logs.go:282] 0 containers: []
	W1124 09:28:01.278585 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:01.278591 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:01.278697 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:01.304781 1707070 cri.go:89] found id: ""
	I1124 09:28:01.304808 1707070 logs.go:282] 0 containers: []
	W1124 09:28:01.304816 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:01.304822 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:01.304900 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:01.330549 1707070 cri.go:89] found id: ""
	I1124 09:28:01.330574 1707070 logs.go:282] 0 containers: []
	W1124 09:28:01.330581 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:01.330586 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:01.330657 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:01.355624 1707070 cri.go:89] found id: ""
	I1124 09:28:01.355646 1707070 logs.go:282] 0 containers: []
	W1124 09:28:01.355654 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:01.355661 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:01.355673 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:01.411485 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:01.411504 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:01.428912 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:01.428927 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:01.493859 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:01.485758   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:01.486490   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:01.488127   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:01.488656   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:01.490257   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:01.485758   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:01.486490   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:01.488127   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:01.488656   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:01.490257   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:01.493881 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:01.493892 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:01.554787 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:01.554808 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:04.088481 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:04.099124 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:04.099191 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:04.123836 1707070 cri.go:89] found id: ""
	I1124 09:28:04.123849 1707070 logs.go:282] 0 containers: []
	W1124 09:28:04.123857 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:04.123862 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:04.123927 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:04.159485 1707070 cri.go:89] found id: ""
	I1124 09:28:04.159499 1707070 logs.go:282] 0 containers: []
	W1124 09:28:04.159506 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:04.159511 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:04.159572 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:04.187075 1707070 cri.go:89] found id: ""
	I1124 09:28:04.187089 1707070 logs.go:282] 0 containers: []
	W1124 09:28:04.187106 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:04.187112 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:04.187169 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:04.217664 1707070 cri.go:89] found id: ""
	I1124 09:28:04.217677 1707070 logs.go:282] 0 containers: []
	W1124 09:28:04.217696 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:04.217702 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:04.217769 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:04.244060 1707070 cri.go:89] found id: ""
	I1124 09:28:04.244075 1707070 logs.go:282] 0 containers: []
	W1124 09:28:04.244082 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:04.244087 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:04.244151 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:04.269297 1707070 cri.go:89] found id: ""
	I1124 09:28:04.269311 1707070 logs.go:282] 0 containers: []
	W1124 09:28:04.269318 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:04.269323 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:04.269382 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:04.296714 1707070 cri.go:89] found id: ""
	I1124 09:28:04.296730 1707070 logs.go:282] 0 containers: []
	W1124 09:28:04.296737 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:04.296745 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:04.296760 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:04.352538 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:04.352558 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:04.370334 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:04.370357 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:04.439006 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:04.429890   13996 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:04.430808   13996 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:04.432656   13996 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:04.433242   13996 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:04.435153   13996 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:04.429890   13996 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:04.430808   13996 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:04.432656   13996 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:04.433242   13996 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:04.435153   13996 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:04.439018 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:04.439027 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:04.503050 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:04.503072 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:07.038611 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:07.049789 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:07.049861 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:07.074863 1707070 cri.go:89] found id: ""
	I1124 09:28:07.074878 1707070 logs.go:282] 0 containers: []
	W1124 09:28:07.074885 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:07.074893 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:07.074950 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:07.099042 1707070 cri.go:89] found id: ""
	I1124 09:28:07.099057 1707070 logs.go:282] 0 containers: []
	W1124 09:28:07.099064 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:07.099070 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:07.099131 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:07.123608 1707070 cri.go:89] found id: ""
	I1124 09:28:07.123622 1707070 logs.go:282] 0 containers: []
	W1124 09:28:07.123630 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:07.123635 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:07.123706 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:07.151391 1707070 cri.go:89] found id: ""
	I1124 09:28:07.151405 1707070 logs.go:282] 0 containers: []
	W1124 09:28:07.151412 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:07.151418 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:07.151475 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:07.182488 1707070 cri.go:89] found id: ""
	I1124 09:28:07.182502 1707070 logs.go:282] 0 containers: []
	W1124 09:28:07.182510 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:07.182515 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:07.182581 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:07.207523 1707070 cri.go:89] found id: ""
	I1124 09:28:07.207537 1707070 logs.go:282] 0 containers: []
	W1124 09:28:07.207546 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:07.207552 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:07.207614 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:07.233412 1707070 cri.go:89] found id: ""
	I1124 09:28:07.233426 1707070 logs.go:282] 0 containers: []
	W1124 09:28:07.233433 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:07.233441 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:07.233451 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:07.288900 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:07.288922 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:07.306472 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:07.306493 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:07.368097 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:07.360574   14102 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:07.360956   14102 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:07.362483   14102 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:07.362820   14102 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:07.364269   14102 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:07.360574   14102 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:07.360956   14102 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:07.362483   14102 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:07.362820   14102 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:07.364269   14102 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:07.368108 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:07.368121 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:07.429983 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:07.430002 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:09.965289 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:09.976378 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:09.976448 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:10.015687 1707070 cri.go:89] found id: ""
	I1124 09:28:10.015705 1707070 logs.go:282] 0 containers: []
	W1124 09:28:10.015714 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:10.015721 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:10.015811 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:10.042717 1707070 cri.go:89] found id: ""
	I1124 09:28:10.042731 1707070 logs.go:282] 0 containers: []
	W1124 09:28:10.042738 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:10.042743 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:10.042805 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:10.069226 1707070 cri.go:89] found id: ""
	I1124 09:28:10.069240 1707070 logs.go:282] 0 containers: []
	W1124 09:28:10.069259 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:10.069265 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:10.069336 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:10.094576 1707070 cri.go:89] found id: ""
	I1124 09:28:10.094591 1707070 logs.go:282] 0 containers: []
	W1124 09:28:10.094599 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:10.094604 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:10.094683 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:10.120910 1707070 cri.go:89] found id: ""
	I1124 09:28:10.120925 1707070 logs.go:282] 0 containers: []
	W1124 09:28:10.120932 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:10.120938 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:10.121007 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:10.148454 1707070 cri.go:89] found id: ""
	I1124 09:28:10.148467 1707070 logs.go:282] 0 containers: []
	W1124 09:28:10.148476 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:10.148482 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:10.148545 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:10.180342 1707070 cri.go:89] found id: ""
	I1124 09:28:10.180356 1707070 logs.go:282] 0 containers: []
	W1124 09:28:10.180363 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:10.180377 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:10.180387 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:10.237982 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:10.238001 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:10.254875 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:10.254891 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:10.315902 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:10.307876   14207 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:10.308640   14207 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:10.310183   14207 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:10.310727   14207 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:10.312228   14207 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:10.307876   14207 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:10.308640   14207 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:10.310183   14207 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:10.310727   14207 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:10.312228   14207 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:10.315912 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:10.315922 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:10.381257 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:10.381276 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:12.913595 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:12.923674 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:12.923734 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:12.947804 1707070 cri.go:89] found id: ""
	I1124 09:28:12.947818 1707070 logs.go:282] 0 containers: []
	W1124 09:28:12.947826 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:12.947832 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:12.947892 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:12.971923 1707070 cri.go:89] found id: ""
	I1124 09:28:12.971937 1707070 logs.go:282] 0 containers: []
	W1124 09:28:12.971944 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:12.971956 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:12.972017 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:12.996325 1707070 cri.go:89] found id: ""
	I1124 09:28:12.996339 1707070 logs.go:282] 0 containers: []
	W1124 09:28:12.996357 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:12.996364 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:12.996436 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:13.022187 1707070 cri.go:89] found id: ""
	I1124 09:28:13.022203 1707070 logs.go:282] 0 containers: []
	W1124 09:28:13.022211 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:13.022224 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:13.022296 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:13.048161 1707070 cri.go:89] found id: ""
	I1124 09:28:13.048184 1707070 logs.go:282] 0 containers: []
	W1124 09:28:13.048192 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:13.048198 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:13.048262 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:13.073539 1707070 cri.go:89] found id: ""
	I1124 09:28:13.073564 1707070 logs.go:282] 0 containers: []
	W1124 09:28:13.073571 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:13.073578 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:13.073655 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:13.098089 1707070 cri.go:89] found id: ""
	I1124 09:28:13.098106 1707070 logs.go:282] 0 containers: []
	W1124 09:28:13.098114 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:13.098122 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:13.098132 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:13.140239 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:13.140255 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:13.197847 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:13.197865 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:13.217667 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:13.217686 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:13.281312 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:13.272865   14321 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:13.273748   14321 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:13.275370   14321 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:13.275717   14321 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:13.277237   14321 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:13.272865   14321 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:13.273748   14321 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:13.275370   14321 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:13.275717   14321 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:13.277237   14321 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:13.281322 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:13.281334 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:15.842684 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:15.853250 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:15.853311 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:15.878981 1707070 cri.go:89] found id: ""
	I1124 09:28:15.878995 1707070 logs.go:282] 0 containers: []
	W1124 09:28:15.879030 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:15.879036 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:15.879099 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:15.904674 1707070 cri.go:89] found id: ""
	I1124 09:28:15.904687 1707070 logs.go:282] 0 containers: []
	W1124 09:28:15.904695 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:15.904700 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:15.904757 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:15.929766 1707070 cri.go:89] found id: ""
	I1124 09:28:15.929780 1707070 logs.go:282] 0 containers: []
	W1124 09:28:15.929787 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:15.929793 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:15.929851 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:15.955453 1707070 cri.go:89] found id: ""
	I1124 09:28:15.955468 1707070 logs.go:282] 0 containers: []
	W1124 09:28:15.955475 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:15.955485 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:15.955543 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:15.983839 1707070 cri.go:89] found id: ""
	I1124 09:28:15.983854 1707070 logs.go:282] 0 containers: []
	W1124 09:28:15.983861 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:15.983866 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:15.983924 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:16.014730 1707070 cri.go:89] found id: ""
	I1124 09:28:16.014744 1707070 logs.go:282] 0 containers: []
	W1124 09:28:16.014752 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:16.014757 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:16.014820 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:16.046753 1707070 cri.go:89] found id: ""
	I1124 09:28:16.046767 1707070 logs.go:282] 0 containers: []
	W1124 09:28:16.046775 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:16.046783 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:16.046794 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:16.064199 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:16.064217 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:16.139691 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:16.122247   14407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:16.122923   14407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:16.124768   14407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:16.125231   14407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:16.126838   14407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:16.122247   14407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:16.122923   14407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:16.124768   14407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:16.125231   14407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:16.126838   14407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:16.139701 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:16.139711 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:16.206802 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:16.206822 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:16.234674 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:16.234690 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:18.790282 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:18.801848 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:18.801912 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:18.827821 1707070 cri.go:89] found id: ""
	I1124 09:28:18.827836 1707070 logs.go:282] 0 containers: []
	W1124 09:28:18.827843 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:18.827849 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:18.827905 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:18.852169 1707070 cri.go:89] found id: ""
	I1124 09:28:18.852184 1707070 logs.go:282] 0 containers: []
	W1124 09:28:18.852191 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:18.852196 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:18.852253 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:18.878610 1707070 cri.go:89] found id: ""
	I1124 09:28:18.878625 1707070 logs.go:282] 0 containers: []
	W1124 09:28:18.878633 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:18.878638 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:18.878702 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:18.903384 1707070 cri.go:89] found id: ""
	I1124 09:28:18.903403 1707070 logs.go:282] 0 containers: []
	W1124 09:28:18.903410 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:18.903416 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:18.903476 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:18.928519 1707070 cri.go:89] found id: ""
	I1124 09:28:18.928534 1707070 logs.go:282] 0 containers: []
	W1124 09:28:18.928542 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:18.928547 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:18.928609 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:18.956808 1707070 cri.go:89] found id: ""
	I1124 09:28:18.956823 1707070 logs.go:282] 0 containers: []
	W1124 09:28:18.956830 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:18.956836 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:18.956893 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:18.985113 1707070 cri.go:89] found id: ""
	I1124 09:28:18.985127 1707070 logs.go:282] 0 containers: []
	W1124 09:28:18.985134 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:18.985142 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:18.985152 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:19.019130 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:19.019146 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:19.075193 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:19.075213 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:19.092291 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:19.092306 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:19.162819 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:19.154959   14526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:19.155361   14526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:19.156834   14526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:19.157156   14526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:19.158629   14526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:19.154959   14526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:19.155361   14526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:19.156834   14526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:19.157156   14526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:19.158629   14526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:19.162839 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:19.162850 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:21.737895 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:21.748053 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:21.748120 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:21.773590 1707070 cri.go:89] found id: ""
	I1124 09:28:21.773604 1707070 logs.go:282] 0 containers: []
	W1124 09:28:21.773611 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:21.773618 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:21.773679 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:21.800809 1707070 cri.go:89] found id: ""
	I1124 09:28:21.800866 1707070 logs.go:282] 0 containers: []
	W1124 09:28:21.800874 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:21.800880 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:21.800938 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:21.826581 1707070 cri.go:89] found id: ""
	I1124 09:28:21.826594 1707070 logs.go:282] 0 containers: []
	W1124 09:28:21.826602 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:21.826607 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:21.826668 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:21.856267 1707070 cri.go:89] found id: ""
	I1124 09:28:21.856282 1707070 logs.go:282] 0 containers: []
	W1124 09:28:21.856289 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:21.856295 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:21.856354 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:21.885138 1707070 cri.go:89] found id: ""
	I1124 09:28:21.885152 1707070 logs.go:282] 0 containers: []
	W1124 09:28:21.885160 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:21.885165 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:21.885224 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:21.909643 1707070 cri.go:89] found id: ""
	I1124 09:28:21.909657 1707070 logs.go:282] 0 containers: []
	W1124 09:28:21.909665 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:21.909671 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:21.909727 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:21.936792 1707070 cri.go:89] found id: ""
	I1124 09:28:21.936806 1707070 logs.go:282] 0 containers: []
	W1124 09:28:21.936813 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:21.936821 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:21.936831 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:21.993870 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:21.993890 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:22.011453 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:22.011474 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:22.078376 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:22.069998   14620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:22.070791   14620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:22.072423   14620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:22.073020   14620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:22.074616   14620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:22.069998   14620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:22.070791   14620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:22.072423   14620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:22.073020   14620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:22.074616   14620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:22.078387 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:22.078398 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:22.140934 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:22.140953 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:24.669313 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:24.679257 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:24.679328 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:24.707632 1707070 cri.go:89] found id: ""
	I1124 09:28:24.707647 1707070 logs.go:282] 0 containers: []
	W1124 09:28:24.707654 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:24.707660 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:24.707720 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:24.733688 1707070 cri.go:89] found id: ""
	I1124 09:28:24.733702 1707070 logs.go:282] 0 containers: []
	W1124 09:28:24.733710 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:24.733715 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:24.733773 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:24.759056 1707070 cri.go:89] found id: ""
	I1124 09:28:24.759071 1707070 logs.go:282] 0 containers: []
	W1124 09:28:24.759078 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:24.759084 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:24.759143 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:24.789918 1707070 cri.go:89] found id: ""
	I1124 09:28:24.789931 1707070 logs.go:282] 0 containers: []
	W1124 09:28:24.789938 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:24.789944 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:24.790003 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:24.814684 1707070 cri.go:89] found id: ""
	I1124 09:28:24.814698 1707070 logs.go:282] 0 containers: []
	W1124 09:28:24.814709 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:24.814714 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:24.814773 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:24.839467 1707070 cri.go:89] found id: ""
	I1124 09:28:24.839489 1707070 logs.go:282] 0 containers: []
	W1124 09:28:24.839497 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:24.839503 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:24.839568 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:24.863902 1707070 cri.go:89] found id: ""
	I1124 09:28:24.863917 1707070 logs.go:282] 0 containers: []
	W1124 09:28:24.863925 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:24.863933 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:24.863943 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:24.919300 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:24.919320 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:24.936150 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:24.936167 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:24.998414 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:24.990181   14723 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:24.990882   14723 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:24.992541   14723 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:24.993206   14723 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:24.994900   14723 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:24.990181   14723 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:24.990882   14723 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:24.992541   14723 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:24.993206   14723 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:24.994900   14723 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:24.998425 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:24.998435 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:25.062735 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:25.062756 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:27.591381 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:27.601598 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:27.601658 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:27.626062 1707070 cri.go:89] found id: ""
	I1124 09:28:27.626076 1707070 logs.go:282] 0 containers: []
	W1124 09:28:27.626084 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:27.626090 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:27.626152 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:27.654571 1707070 cri.go:89] found id: ""
	I1124 09:28:27.654591 1707070 logs.go:282] 0 containers: []
	W1124 09:28:27.654599 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:27.654604 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:27.654664 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:27.679294 1707070 cri.go:89] found id: ""
	I1124 09:28:27.679308 1707070 logs.go:282] 0 containers: []
	W1124 09:28:27.679315 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:27.679320 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:27.679377 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:27.702575 1707070 cri.go:89] found id: ""
	I1124 09:28:27.702588 1707070 logs.go:282] 0 containers: []
	W1124 09:28:27.702595 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:27.702601 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:27.702657 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:27.728251 1707070 cri.go:89] found id: ""
	I1124 09:28:27.728266 1707070 logs.go:282] 0 containers: []
	W1124 09:28:27.728273 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:27.728279 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:27.728339 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:27.752789 1707070 cri.go:89] found id: ""
	I1124 09:28:27.752802 1707070 logs.go:282] 0 containers: []
	W1124 09:28:27.752809 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:27.752815 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:27.752874 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:27.776833 1707070 cri.go:89] found id: ""
	I1124 09:28:27.776847 1707070 logs.go:282] 0 containers: []
	W1124 09:28:27.776854 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:27.776862 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:27.776871 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:27.837612 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:27.837637 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:27.866873 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:27.866890 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:27.925473 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:27.925492 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:27.942415 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:27.942432 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:28.014797 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:27.999267   14840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:28.000058   14840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:28.002028   14840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:28.002995   14840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:28.005197   14840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:27.999267   14840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:28.000058   14840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:28.002028   14840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:28.002995   14840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:28.005197   14840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:30.515707 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:30.526026 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:30.526102 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:30.550904 1707070 cri.go:89] found id: ""
	I1124 09:28:30.550918 1707070 logs.go:282] 0 containers: []
	W1124 09:28:30.550925 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:30.550931 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:30.550996 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:30.580837 1707070 cri.go:89] found id: ""
	I1124 09:28:30.580851 1707070 logs.go:282] 0 containers: []
	W1124 09:28:30.580859 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:30.580864 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:30.580920 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:30.605291 1707070 cri.go:89] found id: ""
	I1124 09:28:30.605305 1707070 logs.go:282] 0 containers: []
	W1124 09:28:30.605312 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:30.605318 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:30.605376 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:30.630158 1707070 cri.go:89] found id: ""
	I1124 09:28:30.630172 1707070 logs.go:282] 0 containers: []
	W1124 09:28:30.630181 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:30.630187 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:30.630254 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:30.653754 1707070 cri.go:89] found id: ""
	I1124 09:28:30.653772 1707070 logs.go:282] 0 containers: []
	W1124 09:28:30.653785 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:30.653790 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:30.653868 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:30.679137 1707070 cri.go:89] found id: ""
	I1124 09:28:30.679150 1707070 logs.go:282] 0 containers: []
	W1124 09:28:30.679157 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:30.679163 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:30.679221 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:30.703850 1707070 cri.go:89] found id: ""
	I1124 09:28:30.703864 1707070 logs.go:282] 0 containers: []
	W1124 09:28:30.703871 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:30.703879 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:30.703888 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:30.772547 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:30.764218   14925 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:30.764926   14925 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:30.766593   14925 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:30.767134   14925 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:30.768991   14925 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:30.764218   14925 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:30.764926   14925 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:30.766593   14925 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:30.767134   14925 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:30.768991   14925 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:30.772557 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:30.772568 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:30.834024 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:30.834043 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:30.862031 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:30.862046 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:30.920292 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:30.920311 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:33.438606 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:33.448762 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:33.448822 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:33.481032 1707070 cri.go:89] found id: ""
	I1124 09:28:33.481046 1707070 logs.go:282] 0 containers: []
	W1124 09:28:33.481053 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:33.481060 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:33.481117 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:33.504561 1707070 cri.go:89] found id: ""
	I1124 09:28:33.504576 1707070 logs.go:282] 0 containers: []
	W1124 09:28:33.504583 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:33.504589 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:33.504654 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:33.528885 1707070 cri.go:89] found id: ""
	I1124 09:28:33.528899 1707070 logs.go:282] 0 containers: []
	W1124 09:28:33.528906 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:33.528915 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:33.528972 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:33.553244 1707070 cri.go:89] found id: ""
	I1124 09:28:33.553258 1707070 logs.go:282] 0 containers: []
	W1124 09:28:33.553271 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:33.553277 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:33.553334 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:33.578519 1707070 cri.go:89] found id: ""
	I1124 09:28:33.578533 1707070 logs.go:282] 0 containers: []
	W1124 09:28:33.578541 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:33.578546 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:33.578607 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:33.602708 1707070 cri.go:89] found id: ""
	I1124 09:28:33.602721 1707070 logs.go:282] 0 containers: []
	W1124 09:28:33.602729 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:33.602734 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:33.602791 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:33.626894 1707070 cri.go:89] found id: ""
	I1124 09:28:33.626908 1707070 logs.go:282] 0 containers: []
	W1124 09:28:33.626916 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:33.626923 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:33.626934 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:33.684867 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:33.684887 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:33.701817 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:33.701834 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:33.775161 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:33.766757   15037 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:33.767480   15037 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:33.769022   15037 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:33.769484   15037 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:33.770951   15037 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:33.766757   15037 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:33.767480   15037 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:33.769022   15037 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:33.769484   15037 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:33.770951   15037 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:33.775172 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:33.775185 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:33.837667 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:33.837688 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:36.365266 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:36.376558 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:36.376622 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:36.412692 1707070 cri.go:89] found id: ""
	I1124 09:28:36.412706 1707070 logs.go:282] 0 containers: []
	W1124 09:28:36.412714 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:36.412719 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:36.412777 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:36.448943 1707070 cri.go:89] found id: ""
	I1124 09:28:36.448957 1707070 logs.go:282] 0 containers: []
	W1124 09:28:36.448964 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:36.448970 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:36.449031 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:36.474906 1707070 cri.go:89] found id: ""
	I1124 09:28:36.474920 1707070 logs.go:282] 0 containers: []
	W1124 09:28:36.474928 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:36.474934 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:36.474990 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:36.503770 1707070 cri.go:89] found id: ""
	I1124 09:28:36.503784 1707070 logs.go:282] 0 containers: []
	W1124 09:28:36.503792 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:36.503797 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:36.503863 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:36.532858 1707070 cri.go:89] found id: ""
	I1124 09:28:36.532872 1707070 logs.go:282] 0 containers: []
	W1124 09:28:36.532880 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:36.532885 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:36.532944 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:36.557874 1707070 cri.go:89] found id: ""
	I1124 09:28:36.557889 1707070 logs.go:282] 0 containers: []
	W1124 09:28:36.557896 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:36.557902 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:36.557959 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:36.582175 1707070 cri.go:89] found id: ""
	I1124 09:28:36.582189 1707070 logs.go:282] 0 containers: []
	W1124 09:28:36.582204 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:36.582212 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:36.582230 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:36.645586 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:36.637487   15138 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:36.638140   15138 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:36.639873   15138 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:36.640429   15138 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:36.641968   15138 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:36.637487   15138 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:36.638140   15138 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:36.639873   15138 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:36.640429   15138 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:36.641968   15138 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:36.645596 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:36.645607 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:36.708211 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:36.708231 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:36.740877 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:36.740894 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:36.798376 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:36.798396 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:39.316746 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:39.327050 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:39.327111 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:39.351416 1707070 cri.go:89] found id: ""
	I1124 09:28:39.351430 1707070 logs.go:282] 0 containers: []
	W1124 09:28:39.351438 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:39.351444 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:39.351500 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:39.375341 1707070 cri.go:89] found id: ""
	I1124 09:28:39.375355 1707070 logs.go:282] 0 containers: []
	W1124 09:28:39.375362 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:39.375367 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:39.375425 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:39.402220 1707070 cri.go:89] found id: ""
	I1124 09:28:39.402235 1707070 logs.go:282] 0 containers: []
	W1124 09:28:39.402241 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:39.402247 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:39.402306 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:39.434081 1707070 cri.go:89] found id: ""
	I1124 09:28:39.434094 1707070 logs.go:282] 0 containers: []
	W1124 09:28:39.434101 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:39.434107 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:39.434167 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:39.467514 1707070 cri.go:89] found id: ""
	I1124 09:28:39.467528 1707070 logs.go:282] 0 containers: []
	W1124 09:28:39.467535 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:39.467540 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:39.467597 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:39.500947 1707070 cri.go:89] found id: ""
	I1124 09:28:39.500961 1707070 logs.go:282] 0 containers: []
	W1124 09:28:39.500968 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:39.500974 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:39.501034 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:39.526637 1707070 cri.go:89] found id: ""
	I1124 09:28:39.526651 1707070 logs.go:282] 0 containers: []
	W1124 09:28:39.526658 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:39.526666 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:39.526676 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:39.582247 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:39.582268 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:39.599751 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:39.599767 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:39.668271 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:39.660949   15251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:39.661446   15251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:39.663149   15251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:39.663643   15251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:39.664706   15251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:39.660949   15251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:39.661446   15251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:39.663149   15251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:39.663643   15251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:39.664706   15251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:39.668281 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:39.668294 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:39.730931 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:39.730951 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:42.260305 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:42.272405 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:42.272489 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:42.300817 1707070 cri.go:89] found id: ""
	I1124 09:28:42.300842 1707070 logs.go:282] 0 containers: []
	W1124 09:28:42.300850 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:42.300856 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:42.300921 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:42.327350 1707070 cri.go:89] found id: ""
	I1124 09:28:42.327368 1707070 logs.go:282] 0 containers: []
	W1124 09:28:42.327377 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:42.327382 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:42.327441 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:42.352768 1707070 cri.go:89] found id: ""
	I1124 09:28:42.352781 1707070 logs.go:282] 0 containers: []
	W1124 09:28:42.352788 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:42.352794 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:42.352858 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:42.384996 1707070 cri.go:89] found id: ""
	I1124 09:28:42.385016 1707070 logs.go:282] 0 containers: []
	W1124 09:28:42.385024 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:42.385035 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:42.385109 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:42.433916 1707070 cri.go:89] found id: ""
	I1124 09:28:42.433942 1707070 logs.go:282] 0 containers: []
	W1124 09:28:42.433963 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:42.433974 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:42.434041 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:42.469962 1707070 cri.go:89] found id: ""
	I1124 09:28:42.469976 1707070 logs.go:282] 0 containers: []
	W1124 09:28:42.469983 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:42.469989 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:42.470045 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:42.494905 1707070 cri.go:89] found id: ""
	I1124 09:28:42.494919 1707070 logs.go:282] 0 containers: []
	W1124 09:28:42.494926 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:42.494934 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:42.494944 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:42.551276 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:42.551295 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:42.568521 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:42.568538 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:42.631652 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:42.623578   15356 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:42.624203   15356 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:42.625718   15356 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:42.626134   15356 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:42.627653   15356 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:42.623578   15356 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:42.624203   15356 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:42.625718   15356 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:42.626134   15356 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:42.627653   15356 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:42.631662 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:42.631689 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:42.697554 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:42.697573 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:45.228012 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:45.242540 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:45.242663 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:45.285651 1707070 cri.go:89] found id: ""
	I1124 09:28:45.285666 1707070 logs.go:282] 0 containers: []
	W1124 09:28:45.285673 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:45.285679 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:45.285747 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:45.315729 1707070 cri.go:89] found id: ""
	I1124 09:28:45.315744 1707070 logs.go:282] 0 containers: []
	W1124 09:28:45.315759 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:45.315766 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:45.315838 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:45.342027 1707070 cri.go:89] found id: ""
	I1124 09:28:45.342041 1707070 logs.go:282] 0 containers: []
	W1124 09:28:45.342048 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:45.342053 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:45.342112 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:45.368019 1707070 cri.go:89] found id: ""
	I1124 09:28:45.368033 1707070 logs.go:282] 0 containers: []
	W1124 09:28:45.368040 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:45.368046 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:45.368102 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:45.406091 1707070 cri.go:89] found id: ""
	I1124 09:28:45.406104 1707070 logs.go:282] 0 containers: []
	W1124 09:28:45.406112 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:45.406119 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:45.406176 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:45.432356 1707070 cri.go:89] found id: ""
	I1124 09:28:45.432369 1707070 logs.go:282] 0 containers: []
	W1124 09:28:45.432377 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:45.432382 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:45.432449 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:45.465291 1707070 cri.go:89] found id: ""
	I1124 09:28:45.465315 1707070 logs.go:282] 0 containers: []
	W1124 09:28:45.465324 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:45.465332 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:45.465345 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:45.527756 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:45.527784 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:45.544616 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:45.544642 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:45.606842 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:45.598345   15463 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:45.599427   15463 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:45.600949   15463 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:45.601550   15463 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:45.603105   15463 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:45.598345   15463 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:45.599427   15463 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:45.600949   15463 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:45.601550   15463 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:45.603105   15463 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:45.606853 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:45.606866 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:45.669056 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:45.669077 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:48.198708 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:48.210384 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:48.210449 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:48.235268 1707070 cri.go:89] found id: ""
	I1124 09:28:48.235282 1707070 logs.go:282] 0 containers: []
	W1124 09:28:48.235289 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:48.235295 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:48.235357 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:48.261413 1707070 cri.go:89] found id: ""
	I1124 09:28:48.261427 1707070 logs.go:282] 0 containers: []
	W1124 09:28:48.261434 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:48.261439 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:48.261496 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:48.291100 1707070 cri.go:89] found id: ""
	I1124 09:28:48.291114 1707070 logs.go:282] 0 containers: []
	W1124 09:28:48.291122 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:48.291127 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:48.291186 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:48.326388 1707070 cri.go:89] found id: ""
	I1124 09:28:48.326412 1707070 logs.go:282] 0 containers: []
	W1124 09:28:48.326420 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:48.326426 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:48.326499 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:48.356212 1707070 cri.go:89] found id: ""
	I1124 09:28:48.356227 1707070 logs.go:282] 0 containers: []
	W1124 09:28:48.356234 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:48.356240 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:48.356299 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:48.384677 1707070 cri.go:89] found id: ""
	I1124 09:28:48.384690 1707070 logs.go:282] 0 containers: []
	W1124 09:28:48.384697 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:48.384703 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:48.384759 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:48.422001 1707070 cri.go:89] found id: ""
	I1124 09:28:48.422015 1707070 logs.go:282] 0 containers: []
	W1124 09:28:48.422022 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:48.422030 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:48.422040 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:48.492980 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:48.493001 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:48.522367 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:48.522383 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:48.577847 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:48.577866 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:48.594803 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:48.594821 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:48.662402 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:48.654176   15580 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:48.655485   15580 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:48.656131   15580 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:48.657084   15580 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:48.658755   15580 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:48.654176   15580 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:48.655485   15580 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:48.656131   15580 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:48.657084   15580 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:48.658755   15580 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:51.162680 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:51.173802 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:51.173865 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:51.200124 1707070 cri.go:89] found id: ""
	I1124 09:28:51.200146 1707070 logs.go:282] 0 containers: []
	W1124 09:28:51.200155 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:51.200161 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:51.200220 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:51.225309 1707070 cri.go:89] found id: ""
	I1124 09:28:51.225323 1707070 logs.go:282] 0 containers: []
	W1124 09:28:51.225330 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:51.225335 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:51.225392 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:51.249971 1707070 cri.go:89] found id: ""
	I1124 09:28:51.249985 1707070 logs.go:282] 0 containers: []
	W1124 09:28:51.249992 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:51.249997 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:51.250053 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:51.275848 1707070 cri.go:89] found id: ""
	I1124 09:28:51.275861 1707070 logs.go:282] 0 containers: []
	W1124 09:28:51.275868 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:51.275874 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:51.275929 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:51.304356 1707070 cri.go:89] found id: ""
	I1124 09:28:51.304370 1707070 logs.go:282] 0 containers: []
	W1124 09:28:51.304386 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:51.304392 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:51.304450 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:51.329000 1707070 cri.go:89] found id: ""
	I1124 09:28:51.329015 1707070 logs.go:282] 0 containers: []
	W1124 09:28:51.329021 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:51.329027 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:51.329099 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:51.357783 1707070 cri.go:89] found id: ""
	I1124 09:28:51.357796 1707070 logs.go:282] 0 containers: []
	W1124 09:28:51.357804 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:51.357811 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:51.357820 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:51.426561 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:51.426582 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:51.456185 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:51.456202 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:51.512504 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:51.512525 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:51.530860 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:51.530877 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:51.596556 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:51.586703   15685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:51.587508   15685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:51.589233   15685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:51.589675   15685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:51.591800   15685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:51.586703   15685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:51.587508   15685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:51.589233   15685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:51.589675   15685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:51.591800   15685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:54.097448 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:54.107646 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:54.107710 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:54.131850 1707070 cri.go:89] found id: ""
	I1124 09:28:54.131869 1707070 logs.go:282] 0 containers: []
	W1124 09:28:54.131877 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:54.131883 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:54.131950 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:54.157778 1707070 cri.go:89] found id: ""
	I1124 09:28:54.157793 1707070 logs.go:282] 0 containers: []
	W1124 09:28:54.157800 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:54.157806 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:54.157871 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:54.183638 1707070 cri.go:89] found id: ""
	I1124 09:28:54.183661 1707070 logs.go:282] 0 containers: []
	W1124 09:28:54.183668 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:54.183676 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:54.183745 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:54.208654 1707070 cri.go:89] found id: ""
	I1124 09:28:54.208668 1707070 logs.go:282] 0 containers: []
	W1124 09:28:54.208675 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:54.208680 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:54.208741 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:54.237302 1707070 cri.go:89] found id: ""
	I1124 09:28:54.237317 1707070 logs.go:282] 0 containers: []
	W1124 09:28:54.237325 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:54.237331 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:54.237390 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:54.261089 1707070 cri.go:89] found id: ""
	I1124 09:28:54.261111 1707070 logs.go:282] 0 containers: []
	W1124 09:28:54.261119 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:54.261124 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:54.261195 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:54.289315 1707070 cri.go:89] found id: ""
	I1124 09:28:54.289337 1707070 logs.go:282] 0 containers: []
	W1124 09:28:54.289345 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:54.289353 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:54.289363 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:54.350840 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:54.350861 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:54.391880 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:54.391897 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:54.457044 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:54.457066 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:54.475507 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:54.475525 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:54.538358 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:54.529952   15794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:54.530805   15794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:54.531583   15794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:54.533115   15794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:54.533777   15794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:54.529952   15794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:54.530805   15794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:54.531583   15794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:54.533115   15794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:54.533777   15794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:57.040068 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:57.050642 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:57.050707 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:57.075811 1707070 cri.go:89] found id: ""
	I1124 09:28:57.075824 1707070 logs.go:282] 0 containers: []
	W1124 09:28:57.075832 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:57.075837 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:57.075899 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:57.106029 1707070 cri.go:89] found id: ""
	I1124 09:28:57.106044 1707070 logs.go:282] 0 containers: []
	W1124 09:28:57.106052 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:57.106058 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:57.106114 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:57.132742 1707070 cri.go:89] found id: ""
	I1124 09:28:57.132756 1707070 logs.go:282] 0 containers: []
	W1124 09:28:57.132763 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:57.132768 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:57.132825 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:57.156809 1707070 cri.go:89] found id: ""
	I1124 09:28:57.156823 1707070 logs.go:282] 0 containers: []
	W1124 09:28:57.156830 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:57.156835 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:57.156898 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:57.182649 1707070 cri.go:89] found id: ""
	I1124 09:28:57.182663 1707070 logs.go:282] 0 containers: []
	W1124 09:28:57.182670 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:57.182676 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:57.182733 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:57.206184 1707070 cri.go:89] found id: ""
	I1124 09:28:57.206198 1707070 logs.go:282] 0 containers: []
	W1124 09:28:57.206205 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:57.206211 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:57.206275 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:57.230629 1707070 cri.go:89] found id: ""
	I1124 09:28:57.230643 1707070 logs.go:282] 0 containers: []
	W1124 09:28:57.230651 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:57.230660 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:57.230670 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:57.287168 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:57.287187 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:57.304021 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:57.304037 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:57.368613 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:57.361126   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:57.361623   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:57.363259   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:57.363659   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:57.365140   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:57.361126   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:57.361623   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:57.363259   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:57.363659   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:57.365140   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:57.368624 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:57.368635 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:57.439834 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:57.439854 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:59.971306 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:59.982006 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:59.982066 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:00.016934 1707070 cri.go:89] found id: ""
	I1124 09:29:00.016951 1707070 logs.go:282] 0 containers: []
	W1124 09:29:00.016966 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:00.016973 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:00.017049 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:00.103638 1707070 cri.go:89] found id: ""
	I1124 09:29:00.103654 1707070 logs.go:282] 0 containers: []
	W1124 09:29:00.103663 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:00.103669 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:00.103740 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:00.170246 1707070 cri.go:89] found id: ""
	I1124 09:29:00.170264 1707070 logs.go:282] 0 containers: []
	W1124 09:29:00.170273 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:00.170280 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:00.170350 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:00.236365 1707070 cri.go:89] found id: ""
	I1124 09:29:00.236382 1707070 logs.go:282] 0 containers: []
	W1124 09:29:00.236390 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:00.236397 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:00.236474 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:00.304007 1707070 cri.go:89] found id: ""
	I1124 09:29:00.304026 1707070 logs.go:282] 0 containers: []
	W1124 09:29:00.304036 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:00.304048 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:00.304139 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:00.347892 1707070 cri.go:89] found id: ""
	I1124 09:29:00.347907 1707070 logs.go:282] 0 containers: []
	W1124 09:29:00.347916 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:00.347924 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:00.348047 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:00.392276 1707070 cri.go:89] found id: ""
	I1124 09:29:00.392292 1707070 logs.go:282] 0 containers: []
	W1124 09:29:00.392304 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:00.392314 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:00.392328 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:00.445097 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:00.445118 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:00.507903 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:00.507923 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:00.532762 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:00.532787 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:00.603329 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:00.595058   16004 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:00.595595   16004 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:00.597748   16004 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:00.598425   16004 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:00.599635   16004 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:00.595058   16004 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:00.595595   16004 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:00.597748   16004 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:00.598425   16004 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:00.599635   16004 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:00.603341 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:00.603352 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:03.164630 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:03.174868 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:03.174928 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:03.198952 1707070 cri.go:89] found id: ""
	I1124 09:29:03.198966 1707070 logs.go:282] 0 containers: []
	W1124 09:29:03.198973 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:03.198979 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:03.199038 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:03.228049 1707070 cri.go:89] found id: ""
	I1124 09:29:03.228063 1707070 logs.go:282] 0 containers: []
	W1124 09:29:03.228070 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:03.228075 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:03.228133 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:03.253873 1707070 cri.go:89] found id: ""
	I1124 09:29:03.253888 1707070 logs.go:282] 0 containers: []
	W1124 09:29:03.253895 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:03.253901 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:03.253969 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:03.277874 1707070 cri.go:89] found id: ""
	I1124 09:29:03.277889 1707070 logs.go:282] 0 containers: []
	W1124 09:29:03.277903 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:03.277909 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:03.277966 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:03.306311 1707070 cri.go:89] found id: ""
	I1124 09:29:03.306333 1707070 logs.go:282] 0 containers: []
	W1124 09:29:03.306340 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:03.306345 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:03.306402 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:03.330412 1707070 cri.go:89] found id: ""
	I1124 09:29:03.330425 1707070 logs.go:282] 0 containers: []
	W1124 09:29:03.330432 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:03.330438 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:03.330572 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:03.359087 1707070 cri.go:89] found id: ""
	I1124 09:29:03.359101 1707070 logs.go:282] 0 containers: []
	W1124 09:29:03.359108 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:03.359116 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:03.359125 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:03.430996 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:03.431015 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:03.467444 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:03.467460 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:03.526316 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:03.526336 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:03.543233 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:03.543250 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:03.605146 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:03.596435   16110 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:03.597161   16110 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:03.598917   16110 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:03.599598   16110 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:03.601425   16110 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:03.596435   16110 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:03.597161   16110 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:03.598917   16110 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:03.599598   16110 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:03.601425   16110 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:06.105406 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:06.116034 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:06.116093 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:06.140111 1707070 cri.go:89] found id: ""
	I1124 09:29:06.140125 1707070 logs.go:282] 0 containers: []
	W1124 09:29:06.140132 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:06.140137 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:06.140195 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:06.164893 1707070 cri.go:89] found id: ""
	I1124 09:29:06.164907 1707070 logs.go:282] 0 containers: []
	W1124 09:29:06.164914 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:06.164920 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:06.164979 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:06.190122 1707070 cri.go:89] found id: ""
	I1124 09:29:06.190137 1707070 logs.go:282] 0 containers: []
	W1124 09:29:06.190144 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:06.190149 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:06.190206 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:06.215548 1707070 cri.go:89] found id: ""
	I1124 09:29:06.215562 1707070 logs.go:282] 0 containers: []
	W1124 09:29:06.215569 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:06.215575 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:06.215630 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:06.239566 1707070 cri.go:89] found id: ""
	I1124 09:29:06.239592 1707070 logs.go:282] 0 containers: []
	W1124 09:29:06.239600 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:06.239605 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:06.239662 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:06.266190 1707070 cri.go:89] found id: ""
	I1124 09:29:06.266223 1707070 logs.go:282] 0 containers: []
	W1124 09:29:06.266232 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:06.266237 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:06.266301 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:06.289910 1707070 cri.go:89] found id: ""
	I1124 09:29:06.289923 1707070 logs.go:282] 0 containers: []
	W1124 09:29:06.289930 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:06.289939 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:06.289955 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:06.353044 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:06.345412   16195 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:06.345855   16195 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:06.347499   16195 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:06.347885   16195 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:06.349511   16195 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:06.345412   16195 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:06.345855   16195 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:06.347499   16195 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:06.347885   16195 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:06.349511   16195 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:06.353054 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:06.353068 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:06.420094 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:06.420114 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:06.452708 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:06.452724 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:06.508689 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:06.508708 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:09.026433 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:09.036862 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:09.036926 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:09.061951 1707070 cri.go:89] found id: ""
	I1124 09:29:09.061965 1707070 logs.go:282] 0 containers: []
	W1124 09:29:09.061972 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:09.061977 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:09.062035 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:09.087954 1707070 cri.go:89] found id: ""
	I1124 09:29:09.087968 1707070 logs.go:282] 0 containers: []
	W1124 09:29:09.087976 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:09.087981 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:09.088044 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:09.112784 1707070 cri.go:89] found id: ""
	I1124 09:29:09.112798 1707070 logs.go:282] 0 containers: []
	W1124 09:29:09.112805 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:09.112810 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:09.112869 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:09.137324 1707070 cri.go:89] found id: ""
	I1124 09:29:09.137339 1707070 logs.go:282] 0 containers: []
	W1124 09:29:09.137347 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:09.137353 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:09.137413 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:09.162408 1707070 cri.go:89] found id: ""
	I1124 09:29:09.162422 1707070 logs.go:282] 0 containers: []
	W1124 09:29:09.162430 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:09.162435 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:09.162513 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:09.191279 1707070 cri.go:89] found id: ""
	I1124 09:29:09.191293 1707070 logs.go:282] 0 containers: []
	W1124 09:29:09.191300 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:09.191305 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:09.191361 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:09.214616 1707070 cri.go:89] found id: ""
	I1124 09:29:09.214630 1707070 logs.go:282] 0 containers: []
	W1124 09:29:09.214637 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:09.214645 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:09.214657 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:09.270146 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:09.270164 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:09.287320 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:09.287340 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:09.352488 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:09.344015   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:09.344642   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:09.346617   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:09.347280   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:09.348952   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:09.344015   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:09.344642   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:09.346617   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:09.347280   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:09.348952   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:09.352499 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:09.352510 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:09.418511 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:09.418532 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:11.954969 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:11.967024 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:11.967089 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:11.990717 1707070 cri.go:89] found id: ""
	I1124 09:29:11.990733 1707070 logs.go:282] 0 containers: []
	W1124 09:29:11.990741 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:11.990746 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:11.990809 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:12.020399 1707070 cri.go:89] found id: ""
	I1124 09:29:12.020413 1707070 logs.go:282] 0 containers: []
	W1124 09:29:12.020421 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:12.020427 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:12.020495 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:12.047081 1707070 cri.go:89] found id: ""
	I1124 09:29:12.047105 1707070 logs.go:282] 0 containers: []
	W1124 09:29:12.047114 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:12.047120 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:12.047185 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:12.072046 1707070 cri.go:89] found id: ""
	I1124 09:29:12.072060 1707070 logs.go:282] 0 containers: []
	W1124 09:29:12.072068 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:12.072074 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:12.072131 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:12.103533 1707070 cri.go:89] found id: ""
	I1124 09:29:12.103547 1707070 logs.go:282] 0 containers: []
	W1124 09:29:12.103554 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:12.103559 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:12.103619 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:12.131885 1707070 cri.go:89] found id: ""
	I1124 09:29:12.131900 1707070 logs.go:282] 0 containers: []
	W1124 09:29:12.131908 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:12.131914 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:12.131977 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:12.156166 1707070 cri.go:89] found id: ""
	I1124 09:29:12.156180 1707070 logs.go:282] 0 containers: []
	W1124 09:29:12.156187 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:12.156195 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:12.156206 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:12.184115 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:12.184131 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:12.239534 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:12.239553 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:12.256920 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:12.256937 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:12.322513 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:12.315053   16419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:12.315552   16419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:12.317173   16419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:12.317659   16419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:12.319113   16419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:12.315053   16419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:12.315552   16419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:12.317173   16419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:12.317659   16419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:12.319113   16419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:12.322536 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:12.322546 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:14.891198 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:14.901386 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:14.901446 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:14.926318 1707070 cri.go:89] found id: ""
	I1124 09:29:14.926340 1707070 logs.go:282] 0 containers: []
	W1124 09:29:14.926347 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:14.926353 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:14.926413 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:14.955083 1707070 cri.go:89] found id: ""
	I1124 09:29:14.955097 1707070 logs.go:282] 0 containers: []
	W1124 09:29:14.955104 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:14.955110 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:14.955167 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:14.979745 1707070 cri.go:89] found id: ""
	I1124 09:29:14.979758 1707070 logs.go:282] 0 containers: []
	W1124 09:29:14.979766 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:14.979771 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:14.979829 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:15.004845 1707070 cri.go:89] found id: ""
	I1124 09:29:15.004861 1707070 logs.go:282] 0 containers: []
	W1124 09:29:15.004869 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:15.004875 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:15.004952 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:15.044211 1707070 cri.go:89] found id: ""
	I1124 09:29:15.044225 1707070 logs.go:282] 0 containers: []
	W1124 09:29:15.044237 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:15.044243 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:15.044330 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:15.075656 1707070 cri.go:89] found id: ""
	I1124 09:29:15.075669 1707070 logs.go:282] 0 containers: []
	W1124 09:29:15.075677 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:15.075682 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:15.075740 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:15.101378 1707070 cri.go:89] found id: ""
	I1124 09:29:15.101392 1707070 logs.go:282] 0 containers: []
	W1124 09:29:15.101400 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:15.101408 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:15.101418 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:15.159297 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:15.159316 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:15.176523 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:15.176541 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:15.242899 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:15.234359   16513 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:15.235294   16513 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:15.237104   16513 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:15.237675   16513 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:15.239169   16513 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:15.234359   16513 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:15.235294   16513 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:15.237104   16513 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:15.237675   16513 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:15.239169   16513 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:15.242909 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:15.242919 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:15.304297 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:15.304319 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:17.833530 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:17.843418 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:17.843476 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:17.867779 1707070 cri.go:89] found id: ""
	I1124 09:29:17.867793 1707070 logs.go:282] 0 containers: []
	W1124 09:29:17.867806 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:17.867811 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:17.867866 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:17.891077 1707070 cri.go:89] found id: ""
	I1124 09:29:17.891090 1707070 logs.go:282] 0 containers: []
	W1124 09:29:17.891098 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:17.891103 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:17.891187 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:17.915275 1707070 cri.go:89] found id: ""
	I1124 09:29:17.915289 1707070 logs.go:282] 0 containers: []
	W1124 09:29:17.915296 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:17.915301 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:17.915357 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:17.943098 1707070 cri.go:89] found id: ""
	I1124 09:29:17.943111 1707070 logs.go:282] 0 containers: []
	W1124 09:29:17.943119 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:17.943124 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:17.943186 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:17.968417 1707070 cri.go:89] found id: ""
	I1124 09:29:17.968430 1707070 logs.go:282] 0 containers: []
	W1124 09:29:17.968437 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:17.968443 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:17.968501 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:17.993301 1707070 cri.go:89] found id: ""
	I1124 09:29:17.993315 1707070 logs.go:282] 0 containers: []
	W1124 09:29:17.993322 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:17.993328 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:17.993385 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:18.021715 1707070 cri.go:89] found id: ""
	I1124 09:29:18.021730 1707070 logs.go:282] 0 containers: []
	W1124 09:29:18.021738 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:18.021746 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:18.021756 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:18.085324 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:18.085345 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:18.118128 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:18.118159 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:18.182148 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:18.182171 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:18.199970 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:18.199990 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:18.266928 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:18.258137   16633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:18.258818   16633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:18.260418   16633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:18.261036   16633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:18.262678   16633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:18.258137   16633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:18.258818   16633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:18.260418   16633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:18.261036   16633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:18.262678   16633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:20.768145 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:20.780890 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:20.780956 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:20.807227 1707070 cri.go:89] found id: ""
	I1124 09:29:20.807241 1707070 logs.go:282] 0 containers: []
	W1124 09:29:20.807248 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:20.807253 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:20.807317 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:20.836452 1707070 cri.go:89] found id: ""
	I1124 09:29:20.836466 1707070 logs.go:282] 0 containers: []
	W1124 09:29:20.836473 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:20.836478 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:20.836535 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:20.861534 1707070 cri.go:89] found id: ""
	I1124 09:29:20.861549 1707070 logs.go:282] 0 containers: []
	W1124 09:29:20.861556 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:20.861561 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:20.861620 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:20.890181 1707070 cri.go:89] found id: ""
	I1124 09:29:20.890196 1707070 logs.go:282] 0 containers: []
	W1124 09:29:20.890203 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:20.890209 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:20.890278 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:20.919882 1707070 cri.go:89] found id: ""
	I1124 09:29:20.919897 1707070 logs.go:282] 0 containers: []
	W1124 09:29:20.919904 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:20.919910 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:20.919973 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:20.948347 1707070 cri.go:89] found id: ""
	I1124 09:29:20.948361 1707070 logs.go:282] 0 containers: []
	W1124 09:29:20.948368 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:20.948373 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:20.948428 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:20.972834 1707070 cri.go:89] found id: ""
	I1124 09:29:20.972847 1707070 logs.go:282] 0 containers: []
	W1124 09:29:20.972855 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:20.972862 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:20.972873 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:21.029330 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:21.029350 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:21.046983 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:21.047000 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:21.112004 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:21.104171   16728 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:21.104918   16728 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:21.106573   16728 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:21.107127   16728 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:21.108653   16728 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:21.104171   16728 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:21.104918   16728 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:21.106573   16728 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:21.107127   16728 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:21.108653   16728 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:21.112015 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:21.112025 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:21.174850 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:21.174870 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:23.702609 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:23.712856 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:23.712939 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:23.741964 1707070 cri.go:89] found id: ""
	I1124 09:29:23.741978 1707070 logs.go:282] 0 containers: []
	W1124 09:29:23.741985 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:23.741991 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:23.742067 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:23.766952 1707070 cri.go:89] found id: ""
	I1124 09:29:23.766966 1707070 logs.go:282] 0 containers: []
	W1124 09:29:23.766972 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:23.766978 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:23.767035 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:23.790992 1707070 cri.go:89] found id: ""
	I1124 09:29:23.791005 1707070 logs.go:282] 0 containers: []
	W1124 09:29:23.791013 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:23.791018 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:23.791073 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:23.819700 1707070 cri.go:89] found id: ""
	I1124 09:29:23.819713 1707070 logs.go:282] 0 containers: []
	W1124 09:29:23.819720 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:23.819726 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:23.819786 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:23.848657 1707070 cri.go:89] found id: ""
	I1124 09:29:23.848683 1707070 logs.go:282] 0 containers: []
	W1124 09:29:23.848690 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:23.848695 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:23.848754 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:23.873546 1707070 cri.go:89] found id: ""
	I1124 09:29:23.873571 1707070 logs.go:282] 0 containers: []
	W1124 09:29:23.873578 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:23.873584 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:23.873654 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:23.899519 1707070 cri.go:89] found id: ""
	I1124 09:29:23.899533 1707070 logs.go:282] 0 containers: []
	W1124 09:29:23.899547 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:23.899556 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:23.899568 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:23.954834 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:23.954854 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:23.971662 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:23.971680 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:24.041660 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:24.033560   16835 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:24.034352   16835 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:24.036062   16835 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:24.036417   16835 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:24.038032   16835 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:24.033560   16835 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:24.034352   16835 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:24.036062   16835 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:24.036417   16835 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:24.038032   16835 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:24.041670 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:24.041681 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:24.105146 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:24.105168 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:26.634760 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:26.646166 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:26.646251 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:26.679257 1707070 cri.go:89] found id: ""
	I1124 09:29:26.679271 1707070 logs.go:282] 0 containers: []
	W1124 09:29:26.679279 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:26.679284 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:26.679344 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:26.710754 1707070 cri.go:89] found id: ""
	I1124 09:29:26.710768 1707070 logs.go:282] 0 containers: []
	W1124 09:29:26.710775 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:26.710782 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:26.710840 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:26.735831 1707070 cri.go:89] found id: ""
	I1124 09:29:26.735845 1707070 logs.go:282] 0 containers: []
	W1124 09:29:26.735852 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:26.735857 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:26.735926 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:26.759918 1707070 cri.go:89] found id: ""
	I1124 09:29:26.759932 1707070 logs.go:282] 0 containers: []
	W1124 09:29:26.759939 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:26.759947 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:26.760002 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:26.783806 1707070 cri.go:89] found id: ""
	I1124 09:29:26.783825 1707070 logs.go:282] 0 containers: []
	W1124 09:29:26.783832 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:26.783838 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:26.783895 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:26.809230 1707070 cri.go:89] found id: ""
	I1124 09:29:26.809244 1707070 logs.go:282] 0 containers: []
	W1124 09:29:26.809252 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:26.809266 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:26.809331 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:26.836902 1707070 cri.go:89] found id: ""
	I1124 09:29:26.836916 1707070 logs.go:282] 0 containers: []
	W1124 09:29:26.836923 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:26.836931 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:26.836942 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:26.853955 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:26.853978 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:26.916186 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:26.907929   16937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:26.908672   16937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:26.910345   16937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:26.910937   16937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:26.912681   16937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:26.907929   16937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:26.908672   16937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:26.910345   16937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:26.910937   16937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:26.912681   16937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:26.916196 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:26.916218 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:26.980050 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:26.980072 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:27.010821 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:27.010838 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:29.573482 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:29.583518 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:29.583582 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:29.608188 1707070 cri.go:89] found id: ""
	I1124 09:29:29.608202 1707070 logs.go:282] 0 containers: []
	W1124 09:29:29.608209 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:29.608214 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:29.608270 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:29.641187 1707070 cri.go:89] found id: ""
	I1124 09:29:29.641201 1707070 logs.go:282] 0 containers: []
	W1124 09:29:29.641209 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:29.641214 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:29.641282 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:29.674249 1707070 cri.go:89] found id: ""
	I1124 09:29:29.674269 1707070 logs.go:282] 0 containers: []
	W1124 09:29:29.674276 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:29.674282 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:29.674339 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:29.700355 1707070 cri.go:89] found id: ""
	I1124 09:29:29.700370 1707070 logs.go:282] 0 containers: []
	W1124 09:29:29.700377 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:29.700382 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:29.700438 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:29.729232 1707070 cri.go:89] found id: ""
	I1124 09:29:29.729246 1707070 logs.go:282] 0 containers: []
	W1124 09:29:29.729253 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:29.729257 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:29.729313 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:29.756753 1707070 cri.go:89] found id: ""
	I1124 09:29:29.756766 1707070 logs.go:282] 0 containers: []
	W1124 09:29:29.756773 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:29.756788 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:29.756849 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:29.782318 1707070 cri.go:89] found id: ""
	I1124 09:29:29.782332 1707070 logs.go:282] 0 containers: []
	W1124 09:29:29.782339 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:29.782347 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:29.782358 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:29.837944 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:29.837963 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:29.855075 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:29.855094 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:29.916212 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:29.907972   17045 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:29.908745   17045 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:29.910447   17045 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:29.910972   17045 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:29.912670   17045 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:29.907972   17045 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:29.908745   17045 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:29.910447   17045 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:29.910972   17045 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:29.912670   17045 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:29.916221 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:29.916232 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:29.978681 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:29.978703 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:32.530833 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:32.541146 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:32.541251 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:32.566525 1707070 cri.go:89] found id: ""
	I1124 09:29:32.566540 1707070 logs.go:282] 0 containers: []
	W1124 09:29:32.566548 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:32.566554 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:32.566622 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:32.591741 1707070 cri.go:89] found id: ""
	I1124 09:29:32.591756 1707070 logs.go:282] 0 containers: []
	W1124 09:29:32.591763 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:32.591768 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:32.591826 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:32.617127 1707070 cri.go:89] found id: ""
	I1124 09:29:32.617141 1707070 logs.go:282] 0 containers: []
	W1124 09:29:32.617148 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:32.617153 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:32.617209 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:32.654493 1707070 cri.go:89] found id: ""
	I1124 09:29:32.654507 1707070 logs.go:282] 0 containers: []
	W1124 09:29:32.654515 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:32.654521 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:32.654580 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:32.685080 1707070 cri.go:89] found id: ""
	I1124 09:29:32.685094 1707070 logs.go:282] 0 containers: []
	W1124 09:29:32.685101 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:32.685106 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:32.685180 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:32.715751 1707070 cri.go:89] found id: ""
	I1124 09:29:32.715766 1707070 logs.go:282] 0 containers: []
	W1124 09:29:32.715782 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:32.715788 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:32.715850 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:32.742395 1707070 cri.go:89] found id: ""
	I1124 09:29:32.742409 1707070 logs.go:282] 0 containers: []
	W1124 09:29:32.742416 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:32.742424 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:32.742434 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:32.760261 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:32.760278 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:32.828736 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:32.819577   17149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:32.820328   17149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:32.822013   17149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:32.822622   17149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:32.824506   17149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:32.819577   17149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:32.820328   17149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:32.822013   17149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:32.822622   17149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:32.824506   17149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:32.828746 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:32.828759 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:32.896940 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:32.896965 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:32.928695 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:32.928711 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:35.485941 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:35.496873 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:35.496934 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:35.525748 1707070 cri.go:89] found id: ""
	I1124 09:29:35.525782 1707070 logs.go:282] 0 containers: []
	W1124 09:29:35.525791 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:35.525796 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:35.525866 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:35.553111 1707070 cri.go:89] found id: ""
	I1124 09:29:35.553126 1707070 logs.go:282] 0 containers: []
	W1124 09:29:35.553134 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:35.553142 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:35.553220 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:35.578594 1707070 cri.go:89] found id: ""
	I1124 09:29:35.578622 1707070 logs.go:282] 0 containers: []
	W1124 09:29:35.578629 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:35.578635 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:35.578706 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:35.607322 1707070 cri.go:89] found id: ""
	I1124 09:29:35.607336 1707070 logs.go:282] 0 containers: []
	W1124 09:29:35.607343 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:35.607348 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:35.607417 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:35.638865 1707070 cri.go:89] found id: ""
	I1124 09:29:35.638880 1707070 logs.go:282] 0 containers: []
	W1124 09:29:35.638887 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:35.638893 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:35.638960 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:35.672327 1707070 cri.go:89] found id: ""
	I1124 09:29:35.672352 1707070 logs.go:282] 0 containers: []
	W1124 09:29:35.672360 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:35.672365 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:35.672431 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:35.700255 1707070 cri.go:89] found id: ""
	I1124 09:29:35.700269 1707070 logs.go:282] 0 containers: []
	W1124 09:29:35.700277 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:35.700285 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:35.700297 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:35.758017 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:35.758037 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:35.775326 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:35.775344 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:35.842090 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:35.833688   17255 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:35.834400   17255 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:35.836148   17255 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:35.836802   17255 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:35.838521   17255 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:35.833688   17255 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:35.834400   17255 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:35.836148   17255 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:35.836802   17255 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:35.838521   17255 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:35.842100 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:35.842120 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:35.908742 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:35.908769 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:38.443689 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:38.453968 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:38.454035 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:38.477762 1707070 cri.go:89] found id: ""
	I1124 09:29:38.477776 1707070 logs.go:282] 0 containers: []
	W1124 09:29:38.477783 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:38.477789 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:38.477853 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:38.506120 1707070 cri.go:89] found id: ""
	I1124 09:29:38.506134 1707070 logs.go:282] 0 containers: []
	W1124 09:29:38.506141 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:38.506147 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:38.506203 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:38.530669 1707070 cri.go:89] found id: ""
	I1124 09:29:38.530691 1707070 logs.go:282] 0 containers: []
	W1124 09:29:38.530699 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:38.530705 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:38.530763 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:38.560535 1707070 cri.go:89] found id: ""
	I1124 09:29:38.560558 1707070 logs.go:282] 0 containers: []
	W1124 09:29:38.560565 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:38.560572 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:38.560631 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:38.586535 1707070 cri.go:89] found id: ""
	I1124 09:29:38.586549 1707070 logs.go:282] 0 containers: []
	W1124 09:29:38.586556 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:38.586561 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:38.586620 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:38.611101 1707070 cri.go:89] found id: ""
	I1124 09:29:38.611115 1707070 logs.go:282] 0 containers: []
	W1124 09:29:38.611122 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:38.611127 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:38.611186 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:38.643467 1707070 cri.go:89] found id: ""
	I1124 09:29:38.643482 1707070 logs.go:282] 0 containers: []
	W1124 09:29:38.643489 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:38.643497 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:38.643508 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:38.708197 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:38.708218 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:38.725978 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:38.725995 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:38.789806 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:38.781672   17359 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:38.782397   17359 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:38.783993   17359 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:38.784577   17359 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:38.786135   17359 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:38.781672   17359 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:38.782397   17359 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:38.783993   17359 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:38.784577   17359 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:38.786135   17359 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:38.789818 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:38.789828 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:38.853085 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:38.853106 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:41.387044 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:41.398117 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:41.398183 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:41.424537 1707070 cri.go:89] found id: ""
	I1124 09:29:41.424551 1707070 logs.go:282] 0 containers: []
	W1124 09:29:41.424558 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:41.424564 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:41.424626 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:41.454716 1707070 cri.go:89] found id: ""
	I1124 09:29:41.454730 1707070 logs.go:282] 0 containers: []
	W1124 09:29:41.454737 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:41.454742 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:41.454801 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:41.479954 1707070 cri.go:89] found id: ""
	I1124 09:29:41.479969 1707070 logs.go:282] 0 containers: []
	W1124 09:29:41.479976 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:41.479981 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:41.480041 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:41.505560 1707070 cri.go:89] found id: ""
	I1124 09:29:41.505575 1707070 logs.go:282] 0 containers: []
	W1124 09:29:41.505582 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:41.505593 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:41.505654 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:41.530996 1707070 cri.go:89] found id: ""
	I1124 09:29:41.531010 1707070 logs.go:282] 0 containers: []
	W1124 09:29:41.531018 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:41.531024 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:41.531090 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:41.557489 1707070 cri.go:89] found id: ""
	I1124 09:29:41.557502 1707070 logs.go:282] 0 containers: []
	W1124 09:29:41.557510 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:41.557516 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:41.557575 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:41.587178 1707070 cri.go:89] found id: ""
	I1124 09:29:41.587192 1707070 logs.go:282] 0 containers: []
	W1124 09:29:41.587199 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:41.587207 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:41.587217 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:41.644853 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:41.644873 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:41.664905 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:41.664924 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:41.731530 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:41.723947   17464 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:41.724430   17464 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:41.726128   17464 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:41.726440   17464 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:41.727892   17464 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:41.723947   17464 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:41.724430   17464 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:41.726128   17464 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:41.726440   17464 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:41.727892   17464 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:41.731540 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:41.731550 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:41.793965 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:41.793985 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:44.323959 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:44.334291 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:44.334352 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:44.364183 1707070 cri.go:89] found id: ""
	I1124 09:29:44.364199 1707070 logs.go:282] 0 containers: []
	W1124 09:29:44.364206 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:44.364212 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:44.364285 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:44.391116 1707070 cri.go:89] found id: ""
	I1124 09:29:44.391130 1707070 logs.go:282] 0 containers: []
	W1124 09:29:44.391137 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:44.391142 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:44.391199 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:44.416448 1707070 cri.go:89] found id: ""
	I1124 09:29:44.416462 1707070 logs.go:282] 0 containers: []
	W1124 09:29:44.416470 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:44.416476 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:44.416533 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:44.442027 1707070 cri.go:89] found id: ""
	I1124 09:29:44.442042 1707070 logs.go:282] 0 containers: []
	W1124 09:29:44.442059 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:44.442065 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:44.442124 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:44.467492 1707070 cri.go:89] found id: ""
	I1124 09:29:44.467516 1707070 logs.go:282] 0 containers: []
	W1124 09:29:44.467525 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:44.467531 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:44.467643 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:44.492900 1707070 cri.go:89] found id: ""
	I1124 09:29:44.492914 1707070 logs.go:282] 0 containers: []
	W1124 09:29:44.492921 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:44.492927 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:44.492986 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:44.518419 1707070 cri.go:89] found id: ""
	I1124 09:29:44.518434 1707070 logs.go:282] 0 containers: []
	W1124 09:29:44.518441 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:44.518449 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:44.518479 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:44.584407 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:44.584427 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:44.616287 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:44.616305 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:44.680013 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:44.680033 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:44.702644 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:44.702662 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:44.770803 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:44.761924   17579 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:44.762682   17579 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:44.764417   17579 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:44.765036   17579 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:44.766673   17579 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:44.761924   17579 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:44.762682   17579 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:44.764417   17579 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:44.765036   17579 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:44.766673   17579 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:47.271699 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:47.283580 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:47.283646 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:47.309341 1707070 cri.go:89] found id: ""
	I1124 09:29:47.309355 1707070 logs.go:282] 0 containers: []
	W1124 09:29:47.309368 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:47.309385 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:47.309443 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:47.335187 1707070 cri.go:89] found id: ""
	I1124 09:29:47.335202 1707070 logs.go:282] 0 containers: []
	W1124 09:29:47.335209 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:47.335214 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:47.335273 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:47.362876 1707070 cri.go:89] found id: ""
	I1124 09:29:47.362891 1707070 logs.go:282] 0 containers: []
	W1124 09:29:47.362898 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:47.362904 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:47.362964 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:47.388290 1707070 cri.go:89] found id: ""
	I1124 09:29:47.388304 1707070 logs.go:282] 0 containers: []
	W1124 09:29:47.388311 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:47.388317 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:47.388374 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:47.416544 1707070 cri.go:89] found id: ""
	I1124 09:29:47.416558 1707070 logs.go:282] 0 containers: []
	W1124 09:29:47.416565 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:47.416570 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:47.416629 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:47.441861 1707070 cri.go:89] found id: ""
	I1124 09:29:47.441875 1707070 logs.go:282] 0 containers: []
	W1124 09:29:47.441902 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:47.441909 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:47.441978 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:47.465857 1707070 cri.go:89] found id: ""
	I1124 09:29:47.465879 1707070 logs.go:282] 0 containers: []
	W1124 09:29:47.465886 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:47.465894 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:47.465905 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:47.523429 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:47.523450 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:47.540445 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:47.540462 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:47.607683 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:47.599524   17667 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:47.600165   17667 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:47.601865   17667 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:47.602402   17667 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:47.603965   17667 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:47.599524   17667 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:47.600165   17667 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:47.601865   17667 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:47.602402   17667 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:47.603965   17667 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:47.607694 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:47.607704 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:47.682000 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:47.682023 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:50.218599 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:50.229182 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:50.229254 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:50.254129 1707070 cri.go:89] found id: ""
	I1124 09:29:50.254143 1707070 logs.go:282] 0 containers: []
	W1124 09:29:50.254150 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:50.254155 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:50.254219 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:50.280233 1707070 cri.go:89] found id: ""
	I1124 09:29:50.280247 1707070 logs.go:282] 0 containers: []
	W1124 09:29:50.280254 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:50.280260 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:50.280317 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:50.304403 1707070 cri.go:89] found id: ""
	I1124 09:29:50.304417 1707070 logs.go:282] 0 containers: []
	W1124 09:29:50.304424 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:50.304430 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:50.304492 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:50.329881 1707070 cri.go:89] found id: ""
	I1124 09:29:50.329897 1707070 logs.go:282] 0 containers: []
	W1124 09:29:50.329904 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:50.329910 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:50.329987 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:50.358124 1707070 cri.go:89] found id: ""
	I1124 09:29:50.358139 1707070 logs.go:282] 0 containers: []
	W1124 09:29:50.358149 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:50.358158 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:50.358246 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:50.384151 1707070 cri.go:89] found id: ""
	I1124 09:29:50.384165 1707070 logs.go:282] 0 containers: []
	W1124 09:29:50.384178 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:50.384196 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:50.384254 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:50.408884 1707070 cri.go:89] found id: ""
	I1124 09:29:50.408899 1707070 logs.go:282] 0 containers: []
	W1124 09:29:50.408906 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:50.408914 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:50.408925 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:50.464122 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:50.464147 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:50.480720 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:50.480736 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:50.544337 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:50.536334   17772 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:50.536956   17772 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:50.538555   17772 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:50.539042   17772 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:50.540634   17772 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:50.536334   17772 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:50.536956   17772 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:50.538555   17772 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:50.539042   17772 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:50.540634   17772 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:50.544348 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:50.544361 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:50.606972 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:50.606993 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:53.143446 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:53.154359 1707070 kubeadm.go:602] duration metric: took 4m4.065975367s to restartPrimaryControlPlane
	W1124 09:29:53.154423 1707070 out.go:285] ! Unable to restart control-plane node(s), will reset cluster: <no value>
	I1124 09:29:53.154529 1707070 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm reset --cri-socket /run/containerd/containerd.sock --force"
	I1124 09:29:53.563147 1707070 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1124 09:29:53.576942 1707070 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1124 09:29:53.584698 1707070 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1124 09:29:53.584758 1707070 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1124 09:29:53.592605 1707070 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1124 09:29:53.592613 1707070 kubeadm.go:158] found existing configuration files:
	
	I1124 09:29:53.592678 1707070 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1124 09:29:53.600460 1707070 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1124 09:29:53.600517 1707070 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1124 09:29:53.607615 1707070 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1124 09:29:53.615236 1707070 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1124 09:29:53.615293 1707070 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1124 09:29:53.622532 1707070 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1124 09:29:53.630501 1707070 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1124 09:29:53.630562 1707070 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1124 09:29:53.638386 1707070 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1124 09:29:53.646257 1707070 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1124 09:29:53.646321 1707070 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1124 09:29:53.653836 1707070 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1124 09:29:53.692708 1707070 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-beta.0
	I1124 09:29:53.692756 1707070 kubeadm.go:319] [preflight] Running pre-flight checks
	I1124 09:29:53.765347 1707070 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1124 09:29:53.765413 1707070 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1124 09:29:53.765447 1707070 kubeadm.go:319] OS: Linux
	I1124 09:29:53.765490 1707070 kubeadm.go:319] CGROUPS_CPU: enabled
	I1124 09:29:53.765537 1707070 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1124 09:29:53.765589 1707070 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1124 09:29:53.765636 1707070 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1124 09:29:53.765682 1707070 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1124 09:29:53.765729 1707070 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1124 09:29:53.765772 1707070 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1124 09:29:53.765819 1707070 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1124 09:29:53.765864 1707070 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1124 09:29:53.828877 1707070 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1124 09:29:53.829001 1707070 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1124 09:29:53.829104 1707070 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1124 09:29:53.834791 1707070 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1124 09:29:53.838245 1707070 out.go:252]   - Generating certificates and keys ...
	I1124 09:29:53.838369 1707070 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1124 09:29:53.838434 1707070 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1124 09:29:53.838527 1707070 kubeadm.go:319] [certs] Using existing apiserver-kubelet-client certificate and key on disk
	I1124 09:29:53.838616 1707070 kubeadm.go:319] [certs] Using existing front-proxy-ca certificate authority
	I1124 09:29:53.838701 1707070 kubeadm.go:319] [certs] Using existing front-proxy-client certificate and key on disk
	I1124 09:29:53.838784 1707070 kubeadm.go:319] [certs] Using existing etcd/ca certificate authority
	I1124 09:29:53.838854 1707070 kubeadm.go:319] [certs] Using existing etcd/server certificate and key on disk
	I1124 09:29:53.838919 1707070 kubeadm.go:319] [certs] Using existing etcd/peer certificate and key on disk
	I1124 09:29:53.839002 1707070 kubeadm.go:319] [certs] Using existing etcd/healthcheck-client certificate and key on disk
	I1124 09:29:53.839386 1707070 kubeadm.go:319] [certs] Using existing apiserver-etcd-client certificate and key on disk
	I1124 09:29:53.839639 1707070 kubeadm.go:319] [certs] Using the existing "sa" key
	I1124 09:29:53.839706 1707070 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1124 09:29:54.545063 1707070 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1124 09:29:55.036514 1707070 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1124 09:29:55.148786 1707070 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1124 09:29:55.311399 1707070 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1124 09:29:55.656188 1707070 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1124 09:29:55.656996 1707070 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1124 09:29:55.659590 1707070 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1124 09:29:55.662658 1707070 out.go:252]   - Booting up control plane ...
	I1124 09:29:55.662786 1707070 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1124 09:29:55.662870 1707070 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1124 09:29:55.664747 1707070 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1124 09:29:55.686536 1707070 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1124 09:29:55.686657 1707070 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1124 09:29:55.694440 1707070 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1124 09:29:55.694885 1707070 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1124 09:29:55.694934 1707070 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1124 09:29:55.830944 1707070 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1124 09:29:55.831051 1707070 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1124 09:33:55.829210 1707070 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.000251849s
	I1124 09:33:55.829235 1707070 kubeadm.go:319] 
	I1124 09:33:55.829291 1707070 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1124 09:33:55.829323 1707070 kubeadm.go:319] 	- The kubelet is not running
	I1124 09:33:55.829428 1707070 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1124 09:33:55.829432 1707070 kubeadm.go:319] 
	I1124 09:33:55.829536 1707070 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1124 09:33:55.829573 1707070 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1124 09:33:55.829603 1707070 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1124 09:33:55.829606 1707070 kubeadm.go:319] 
	I1124 09:33:55.833661 1707070 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1124 09:33:55.834099 1707070 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1124 09:33:55.834220 1707070 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1124 09:33:55.834508 1707070 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	I1124 09:33:55.834517 1707070 kubeadm.go:319] 
	I1124 09:33:55.834670 1707070 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	W1124 09:33:55.834735 1707070 out.go:285] ! initialization failed, will try again: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000251849s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	I1124 09:33:55.834825 1707070 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm reset --cri-socket /run/containerd/containerd.sock --force"
	I1124 09:33:56.243415 1707070 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1124 09:33:56.256462 1707070 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1124 09:33:56.256517 1707070 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1124 09:33:56.264387 1707070 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1124 09:33:56.264397 1707070 kubeadm.go:158] found existing configuration files:
	
	I1124 09:33:56.264448 1707070 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1124 09:33:56.272152 1707070 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1124 09:33:56.272210 1707070 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1124 09:33:56.279938 1707070 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1124 09:33:56.287667 1707070 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1124 09:33:56.287720 1707070 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1124 09:33:56.295096 1707070 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1124 09:33:56.302699 1707070 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1124 09:33:56.302758 1707070 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1124 09:33:56.310421 1707070 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1124 09:33:56.318128 1707070 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1124 09:33:56.318183 1707070 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1124 09:33:56.325438 1707070 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1124 09:33:56.364513 1707070 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-beta.0
	I1124 09:33:56.364563 1707070 kubeadm.go:319] [preflight] Running pre-flight checks
	I1124 09:33:56.440273 1707070 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1124 09:33:56.440340 1707070 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1124 09:33:56.440376 1707070 kubeadm.go:319] OS: Linux
	I1124 09:33:56.440420 1707070 kubeadm.go:319] CGROUPS_CPU: enabled
	I1124 09:33:56.440467 1707070 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1124 09:33:56.440513 1707070 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1124 09:33:56.440560 1707070 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1124 09:33:56.440606 1707070 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1124 09:33:56.440654 1707070 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1124 09:33:56.440697 1707070 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1124 09:33:56.440749 1707070 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1124 09:33:56.440794 1707070 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1124 09:33:56.504487 1707070 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1124 09:33:56.504590 1707070 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1124 09:33:56.504685 1707070 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1124 09:33:56.510220 1707070 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1124 09:33:56.513847 1707070 out.go:252]   - Generating certificates and keys ...
	I1124 09:33:56.513936 1707070 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1124 09:33:56.514003 1707070 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1124 09:33:56.514078 1707070 kubeadm.go:319] [certs] Using existing apiserver-kubelet-client certificate and key on disk
	I1124 09:33:56.514137 1707070 kubeadm.go:319] [certs] Using existing front-proxy-ca certificate authority
	I1124 09:33:56.514205 1707070 kubeadm.go:319] [certs] Using existing front-proxy-client certificate and key on disk
	I1124 09:33:56.514264 1707070 kubeadm.go:319] [certs] Using existing etcd/ca certificate authority
	I1124 09:33:56.514326 1707070 kubeadm.go:319] [certs] Using existing etcd/server certificate and key on disk
	I1124 09:33:56.514386 1707070 kubeadm.go:319] [certs] Using existing etcd/peer certificate and key on disk
	I1124 09:33:56.514481 1707070 kubeadm.go:319] [certs] Using existing etcd/healthcheck-client certificate and key on disk
	I1124 09:33:56.514553 1707070 kubeadm.go:319] [certs] Using existing apiserver-etcd-client certificate and key on disk
	I1124 09:33:56.514589 1707070 kubeadm.go:319] [certs] Using the existing "sa" key
	I1124 09:33:56.514644 1707070 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1124 09:33:57.046366 1707070 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1124 09:33:57.432965 1707070 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1124 09:33:57.802873 1707070 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1124 09:33:58.414576 1707070 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1124 09:33:58.520825 1707070 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1124 09:33:58.522049 1707070 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1124 09:33:58.526436 1707070 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1124 09:33:58.529676 1707070 out.go:252]   - Booting up control plane ...
	I1124 09:33:58.529779 1707070 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1124 09:33:58.529855 1707070 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1124 09:33:58.529921 1707070 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1124 09:33:58.549683 1707070 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1124 09:33:58.549801 1707070 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1124 09:33:58.557327 1707070 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1124 09:33:58.557589 1707070 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1124 09:33:58.557812 1707070 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1124 09:33:58.696439 1707070 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1124 09:33:58.696553 1707070 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1124 09:37:58.697446 1707070 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.001230859s
	I1124 09:37:58.697472 1707070 kubeadm.go:319] 
	I1124 09:37:58.697558 1707070 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1124 09:37:58.697602 1707070 kubeadm.go:319] 	- The kubelet is not running
	I1124 09:37:58.697730 1707070 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1124 09:37:58.697737 1707070 kubeadm.go:319] 
	I1124 09:37:58.697847 1707070 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1124 09:37:58.697878 1707070 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1124 09:37:58.697921 1707070 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1124 09:37:58.697925 1707070 kubeadm.go:319] 
	I1124 09:37:58.701577 1707070 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1124 09:37:58.701990 1707070 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1124 09:37:58.702104 1707070 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1124 09:37:58.702344 1707070 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	I1124 09:37:58.702350 1707070 kubeadm.go:319] 
	I1124 09:37:58.702417 1707070 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	I1124 09:37:58.702481 1707070 kubeadm.go:403] duration metric: took 12m9.652556415s to StartCluster
	I1124 09:37:58.702514 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:37:58.702578 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:37:58.726968 1707070 cri.go:89] found id: ""
	I1124 09:37:58.726981 1707070 logs.go:282] 0 containers: []
	W1124 09:37:58.726988 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:37:58.726994 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:37:58.727055 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:37:58.756184 1707070 cri.go:89] found id: ""
	I1124 09:37:58.756198 1707070 logs.go:282] 0 containers: []
	W1124 09:37:58.756205 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:37:58.756210 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:37:58.756266 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:37:58.781056 1707070 cri.go:89] found id: ""
	I1124 09:37:58.781070 1707070 logs.go:282] 0 containers: []
	W1124 09:37:58.781077 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:37:58.781082 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:37:58.781145 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:37:58.805769 1707070 cri.go:89] found id: ""
	I1124 09:37:58.805783 1707070 logs.go:282] 0 containers: []
	W1124 09:37:58.805790 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:37:58.805796 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:37:58.805854 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:37:58.830758 1707070 cri.go:89] found id: ""
	I1124 09:37:58.830780 1707070 logs.go:282] 0 containers: []
	W1124 09:37:58.830791 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:37:58.830797 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:37:58.830857 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:37:58.855967 1707070 cri.go:89] found id: ""
	I1124 09:37:58.855981 1707070 logs.go:282] 0 containers: []
	W1124 09:37:58.855988 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:37:58.855994 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:37:58.856051 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:37:58.890842 1707070 cri.go:89] found id: ""
	I1124 09:37:58.890857 1707070 logs.go:282] 0 containers: []
	W1124 09:37:58.890865 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:37:58.890873 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:37:58.890885 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:37:58.910142 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:37:58.910157 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:37:58.985463 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:37:58.976283   21584 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:37:58.977104   21584 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:37:58.978904   21584 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:37:58.979496   21584 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:37:58.981268   21584 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:37:58.976283   21584 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:37:58.977104   21584 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:37:58.978904   21584 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:37:58.979496   21584 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:37:58.981268   21584 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:37:58.985474 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:37:58.985486 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:37:59.051823 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:37:59.051845 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:37:59.080123 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:37:59.080139 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	W1124 09:37:59.137954 1707070 out.go:434] Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001230859s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	W1124 09:37:59.138000 1707070 out.go:285] * 
	W1124 09:37:59.138117 1707070 out.go:285] X Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001230859s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	W1124 09:37:59.138177 1707070 out.go:285] * 
	W1124 09:37:59.140306 1707070 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1124 09:37:59.145839 1707070 out.go:203] 
	W1124 09:37:59.149636 1707070 out.go:285] X Exiting due to K8S_KUBELET_NOT_RUNNING: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001230859s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	W1124 09:37:59.149678 1707070 out.go:285] * Suggestion: Check output of 'journalctl -xeu kubelet', try passing --extra-config=kubelet.cgroup-driver=systemd to minikube start
	W1124 09:37:59.149707 1707070 out.go:285] * Related issue: https://github.com/kubernetes/minikube/issues/4172
	I1124 09:37:59.153358 1707070 out.go:203] 
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> containerd <==
	Nov 24 09:25:47 functional-291288 containerd[10324]: time="2025-11-24T09:25:47.268843370Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
	Nov 24 09:25:47 functional-291288 containerd[10324]: time="2025-11-24T09:25:47.268907855Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
	Nov 24 09:25:47 functional-291288 containerd[10324]: time="2025-11-24T09:25:47.268972348Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
	Nov 24 09:25:47 functional-291288 containerd[10324]: time="2025-11-24T09:25:47.269028340Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
	Nov 24 09:25:47 functional-291288 containerd[10324]: time="2025-11-24T09:25:47.269093646Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
	Nov 24 09:25:47 functional-291288 containerd[10324]: time="2025-11-24T09:25:47.269161708Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1
	Nov 24 09:25:47 functional-291288 containerd[10324]: time="2025-11-24T09:25:47.269228367Z" level=info msg="runtime interface created"
	Nov 24 09:25:47 functional-291288 containerd[10324]: time="2025-11-24T09:25:47.269282981Z" level=info msg="created NRI interface"
	Nov 24 09:25:47 functional-291288 containerd[10324]: time="2025-11-24T09:25:47.269381066Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
	Nov 24 09:25:47 functional-291288 containerd[10324]: time="2025-11-24T09:25:47.269475385Z" level=info msg="Connect containerd service"
	Nov 24 09:25:47 functional-291288 containerd[10324]: time="2025-11-24T09:25:47.269860021Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
	Nov 24 09:25:47 functional-291288 containerd[10324]: time="2025-11-24T09:25:47.270611232Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
	Nov 24 09:25:47 functional-291288 containerd[10324]: time="2025-11-24T09:25:47.281475104Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
	Nov 24 09:25:47 functional-291288 containerd[10324]: time="2025-11-24T09:25:47.281548105Z" level=info msg=serving... address=/run/containerd/containerd.sock
	Nov 24 09:25:47 functional-291288 containerd[10324]: time="2025-11-24T09:25:47.281591536Z" level=info msg="Start subscribing containerd event"
	Nov 24 09:25:47 functional-291288 containerd[10324]: time="2025-11-24T09:25:47.281638691Z" level=info msg="Start recovering state"
	Nov 24 09:25:47 functional-291288 containerd[10324]: time="2025-11-24T09:25:47.310177719Z" level=info msg="Start event monitor"
	Nov 24 09:25:47 functional-291288 containerd[10324]: time="2025-11-24T09:25:47.310369614Z" level=info msg="Start cni network conf syncer for default"
	Nov 24 09:25:47 functional-291288 containerd[10324]: time="2025-11-24T09:25:47.310437783Z" level=info msg="Start streaming server"
	Nov 24 09:25:47 functional-291288 containerd[10324]: time="2025-11-24T09:25:47.310546157Z" level=info msg="Registered namespace \"k8s.io\" with NRI"
	Nov 24 09:25:47 functional-291288 containerd[10324]: time="2025-11-24T09:25:47.310605341Z" level=info msg="runtime interface starting up..."
	Nov 24 09:25:47 functional-291288 containerd[10324]: time="2025-11-24T09:25:47.310661563Z" level=info msg="starting plugins..."
	Nov 24 09:25:47 functional-291288 containerd[10324]: time="2025-11-24T09:25:47.310723160Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
	Nov 24 09:25:47 functional-291288 systemd[1]: Started containerd.service - containerd container runtime.
	Nov 24 09:25:47 functional-291288 containerd[10324]: time="2025-11-24T09:25:47.312804699Z" level=info msg="containerd successfully booted in 0.067611s"
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:38:02.843467   21839 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:38:02.844269   21839 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:38:02.845364   21839 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:38:02.845712   21839 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:38:02.846925   21839 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[Nov24 08:20] overlayfs: idmapped layers are currently not supported
	[Nov24 08:21] overlayfs: idmapped layers are currently not supported
	[Nov24 08:22] overlayfs: idmapped layers are currently not supported
	[Nov24 08:23] overlayfs: idmapped layers are currently not supported
	[Nov24 08:24] overlayfs: idmapped layers are currently not supported
	[ +19.566509] overlayfs: idmapped layers are currently not supported
	[Nov24 08:25] overlayfs: idmapped layers are currently not supported
	[ +35.934095] overlayfs: idmapped layers are currently not supported
	[Nov24 08:27] overlayfs: idmapped layers are currently not supported
	[Nov24 08:28] overlayfs: idmapped layers are currently not supported
	[Nov24 08:29] overlayfs: idmapped layers are currently not supported
	[Nov24 08:30] overlayfs: idmapped layers are currently not supported
	[Nov24 08:31] overlayfs: idmapped layers are currently not supported
	[ +44.505180] overlayfs: idmapped layers are currently not supported
	[Nov24 08:32] overlayfs: idmapped layers are currently not supported
	[Nov24 08:33] overlayfs: idmapped layers are currently not supported
	[Nov24 08:34] overlayfs: idmapped layers are currently not supported
	[ +21.137877] overlayfs: idmapped layers are currently not supported
	[ +28.724175] overlayfs: idmapped layers are currently not supported
	[Nov24 08:36] overlayfs: idmapped layers are currently not supported
	[Nov24 08:37] overlayfs: idmapped layers are currently not supported
	[Nov24 08:38] overlayfs: idmapped layers are currently not supported
	[  +4.063834] overlayfs: idmapped layers are currently not supported
	[Nov24 08:40] overlayfs: idmapped layers are currently not supported
	[Nov24 08:42] kauditd_printk_skb: 8 callbacks suppressed
	
	
	==> kernel <==
	 09:38:02 up  8:20,  0 user,  load average: 0.04, 0.14, 0.31
	Linux functional-291288 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Nov 24 09:37:59 functional-291288 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Nov 24 09:38:00 functional-291288 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 322.
	Nov 24 09:38:00 functional-291288 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:38:00 functional-291288 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:38:00 functional-291288 kubelet[21659]: E1124 09:38:00.453131   21659 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Nov 24 09:38:00 functional-291288 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Nov 24 09:38:00 functional-291288 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Nov 24 09:38:01 functional-291288 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 323.
	Nov 24 09:38:01 functional-291288 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:38:01 functional-291288 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:38:01 functional-291288 kubelet[21716]: E1124 09:38:01.215652   21716 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Nov 24 09:38:01 functional-291288 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Nov 24 09:38:01 functional-291288 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Nov 24 09:38:01 functional-291288 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 324.
	Nov 24 09:38:01 functional-291288 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:38:01 functional-291288 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:38:01 functional-291288 kubelet[21750]: E1124 09:38:01.941996   21750 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Nov 24 09:38:01 functional-291288 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Nov 24 09:38:01 functional-291288 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Nov 24 09:38:02 functional-291288 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 325.
	Nov 24 09:38:02 functional-291288 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:38:02 functional-291288 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:38:02 functional-291288 kubelet[21801]: E1124 09:38:02.697097   21801 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Nov 24 09:38:02 functional-291288 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Nov 24 09:38:02 functional-291288 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	

-- /stdout --
helpers_test.go:262: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-291288 -n functional-291288
helpers_test.go:262: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-291288 -n functional-291288: exit status 2 (320.233395ms)

-- stdout --
	Stopped

-- /stdout --
helpers_test.go:262: status error: exit status 2 (may be ok)
helpers_test.go:264: "functional-291288" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/ComponentHealth (2.24s)
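Editor's note: the crash loop in the kubelet journal above ("kubelet is configured to not run on a host using cgroup v1") matches the kubeadm preflight warning, which says to set the kubelet configuration option 'FailCgroupV1' to 'false' on cgroup v1 hosts. A minimal sketch of what that override could look like, assuming the lowercase YAML field name corresponds to the 'FailCgroupV1' option named in the warning; this is illustrative, not the fix minikube applies:

```yaml
# KubeletConfiguration fragment for a cgroup v1 host (kubelet v1.35+).
# Field name assumed from the kubeadm warning text above.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
failCgroupV1: false
```

Per the warning, the SystemVerification preflight check would also need to be explicitly skipped; minikube already patches the kubelet configuration during init (see the "[patches] Applied patch ... kubeletconfiguration" line above), which is where such an override would plausibly land.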

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/InvalidService (0.06s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/InvalidService
functional_test.go:2326: (dbg) Run:  kubectl --context functional-291288 apply -f testdata/invalidsvc.yaml
functional_test.go:2326: (dbg) Non-zero exit: kubectl --context functional-291288 apply -f testdata/invalidsvc.yaml: exit status 1 (61.195462ms)

** stderr ** 
	error: error validating "testdata/invalidsvc.yaml": error validating data: failed to download openapi: Get "https://192.168.49.2:8441/openapi/v2?timeout=32s": dial tcp 192.168.49.2:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false

** /stderr **
functional_test.go:2328: kubectl --context functional-291288 apply -f testdata/invalidsvc.yaml failed: exit status 1
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/InvalidService (0.06s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/DashboardCmd (1.73s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/DashboardCmd
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/DashboardCmd

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/DashboardCmd
functional_test.go:920: (dbg) daemon: [out/minikube-linux-arm64 dashboard --url --port 36195 -p functional-291288 --alsologtostderr -v=1]
functional_test.go:933: output didn't produce a URL
functional_test.go:925: (dbg) stopping [out/minikube-linux-arm64 dashboard --url --port 36195 -p functional-291288 --alsologtostderr -v=1] ...
functional_test.go:925: (dbg) [out/minikube-linux-arm64 dashboard --url --port 36195 -p functional-291288 --alsologtostderr -v=1] stdout:
functional_test.go:925: (dbg) [out/minikube-linux-arm64 dashboard --url --port 36195 -p functional-291288 --alsologtostderr -v=1] stderr:
I1124 09:39:57.202778 1725904 out.go:360] Setting OutFile to fd 1 ...
I1124 09:39:57.202895 1725904 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1124 09:39:57.202904 1725904 out.go:374] Setting ErrFile to fd 2...
I1124 09:39:57.202909 1725904 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1124 09:39:57.203222 1725904 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21978-1652607/.minikube/bin
I1124 09:39:57.203477 1725904 mustload.go:66] Loading cluster: functional-291288
I1124 09:39:57.203910 1725904 config.go:182] Loaded profile config "functional-291288": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
I1124 09:39:57.204414 1725904 cli_runner.go:164] Run: docker container inspect functional-291288 --format={{.State.Status}}
I1124 09:39:57.223262 1725904 host.go:66] Checking if "functional-291288" exists ...
I1124 09:39:57.223621 1725904 cli_runner.go:164] Run: docker system info --format "{{json .}}"
I1124 09:39:57.282391 1725904 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-11-24 09:39:57.272478276 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214831104 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
I1124 09:39:57.282539 1725904 api_server.go:166] Checking apiserver status ...
I1124 09:39:57.282601 1725904 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
I1124 09:39:57.282644 1725904 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
I1124 09:39:57.299921 1725904 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34684 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-291288/id_rsa Username:docker}
W1124 09:39:57.408171 1725904 api_server.go:170] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
stdout:

stderr:
I1124 09:39:57.411423 1725904 out.go:179] * The control-plane node functional-291288 apiserver is not running: (state=Stopped)
I1124 09:39:57.414349 1725904 out.go:179]   To start a cluster, run: "minikube start -p functional-291288"
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:223: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/DashboardCmd]: network settings <======
helpers_test.go:230: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:238: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/DashboardCmd]: docker inspect <======
helpers_test.go:239: (dbg) Run:  docker inspect functional-291288
helpers_test.go:243: (dbg) docker inspect functional-291288:

-- stdout --
	[
	    {
	        "Id": "70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52",
	        "Created": "2025-11-24T09:10:51.896020191Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 1695240,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-11-24T09:10:51.968983407Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:572c983e466f1f784136812eef5cc59ac623db764bc7704d3676c4643993fd08",
	        "ResolvConfPath": "/var/lib/docker/containers/70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52/hostname",
	        "HostsPath": "/var/lib/docker/containers/70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52/hosts",
	        "LogPath": "/var/lib/docker/containers/70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52/70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52-json.log",
	        "Name": "/functional-291288",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "functional-291288:/var",
	                "/lib/modules:/lib/modules:ro"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-291288",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52",
	                "LowerDir": "/var/lib/docker/overlay2/3cc6f5f8c809d502c515552ad283ef0dee330beb830a15376b0447c77fbc81b7-init/diff:/var/lib/docker/overlay2/bf294786c7ade59cb761c7bdcc27045362c3779b00a569b685443f34ade17737/diff",
	                "MergedDir": "/var/lib/docker/overlay2/3cc6f5f8c809d502c515552ad283ef0dee330beb830a15376b0447c77fbc81b7/merged",
	                "UpperDir": "/var/lib/docker/overlay2/3cc6f5f8c809d502c515552ad283ef0dee330beb830a15376b0447c77fbc81b7/diff",
	                "WorkDir": "/var/lib/docker/overlay2/3cc6f5f8c809d502c515552ad283ef0dee330beb830a15376b0447c77fbc81b7/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            },
	            {
	                "Type": "volume",
	                "Name": "functional-291288",
	                "Source": "/var/lib/docker/volumes/functional-291288/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-291288",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-291288",
	                "name.minikube.sigs.k8s.io": "functional-291288",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "09c1c2eef0dca6362dde63b4cbc372c0cfa3e4fd084b8745043d8b88925691bf",
	            "SandboxKey": "/var/run/docker/netns/09c1c2eef0dc",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34684"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34685"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34688"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34686"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34687"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-291288": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "7e:49:22:0b:f9:2c",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "1e8f91e8ad9f46b831bbb1b0589b0022d940ee9875e64a648dc80612f3ca93dc",
	                    "EndpointID": "5de5ca8ccb07584b21e6e4e30dba12e0233e8d28c3e48e705cddffe75263b337",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-291288",
	                        "70848be15fcc"
	                    ]
	                }
	            }
	        }
	    }
	]

-- /stdout --
helpers_test.go:247: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-291288 -n functional-291288
helpers_test.go:247: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-291288 -n functional-291288: exit status 2 (319.361875ms)

-- stdout --
	Running

-- /stdout --
helpers_test.go:247: status error: exit status 2 (may be ok)
helpers_test.go:252: <<< TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/DashboardCmd FAILED: start of post-mortem logs <<<
helpers_test.go:253: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/DashboardCmd]: minikube logs <======
helpers_test.go:255: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 logs -n 25
helpers_test.go:260: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/DashboardCmd logs: 
-- stdout --
	
	==> Audit <==
	┌───────────┬─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│  COMMAND  │                                                                        ARGS                                                                         │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├───────────┼─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ service   │ functional-291288 service hello-node --url                                                                                                          │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:39 UTC │                     │
	│ mount     │ -p functional-291288 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo1536350623/001:/mount-9p --alsologtostderr -v=1              │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:39 UTC │                     │
	│ ssh       │ functional-291288 ssh findmnt -T /mount-9p | grep 9p                                                                                                │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:39 UTC │                     │
	│ ssh       │ functional-291288 ssh findmnt -T /mount-9p | grep 9p                                                                                                │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:39 UTC │ 24 Nov 25 09:39 UTC │
	│ ssh       │ functional-291288 ssh -- ls -la /mount-9p                                                                                                           │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:39 UTC │ 24 Nov 25 09:39 UTC │
	│ ssh       │ functional-291288 ssh cat /mount-9p/test-1763977187076229275                                                                                        │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:39 UTC │ 24 Nov 25 09:39 UTC │
	│ ssh       │ functional-291288 ssh mount | grep 9p; ls -la /mount-9p; cat /mount-9p/pod-dates                                                                    │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:39 UTC │                     │
	│ ssh       │ functional-291288 ssh sudo umount -f /mount-9p                                                                                                      │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:39 UTC │ 24 Nov 25 09:39 UTC │
	│ ssh       │ functional-291288 ssh findmnt -T /mount-9p | grep 9p                                                                                                │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:39 UTC │                     │
	│ mount     │ -p functional-291288 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo2133955644/001:/mount-9p --alsologtostderr -v=1 --port 46464 │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:39 UTC │                     │
	│ ssh       │ functional-291288 ssh findmnt -T /mount-9p | grep 9p                                                                                                │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:39 UTC │ 24 Nov 25 09:39 UTC │
	│ ssh       │ functional-291288 ssh -- ls -la /mount-9p                                                                                                           │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:39 UTC │ 24 Nov 25 09:39 UTC │
	│ ssh       │ functional-291288 ssh sudo umount -f /mount-9p                                                                                                      │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:39 UTC │                     │
	│ mount     │ -p functional-291288 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo2066364349/001:/mount1 --alsologtostderr -v=1                │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:39 UTC │                     │
	│ mount     │ -p functional-291288 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo2066364349/001:/mount2 --alsologtostderr -v=1                │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:39 UTC │                     │
	│ ssh       │ functional-291288 ssh findmnt -T /mount1                                                                                                            │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:39 UTC │                     │
	│ mount     │ -p functional-291288 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo2066364349/001:/mount3 --alsologtostderr -v=1                │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:39 UTC │                     │
	│ ssh       │ functional-291288 ssh findmnt -T /mount1                                                                                                            │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:39 UTC │ 24 Nov 25 09:39 UTC │
	│ ssh       │ functional-291288 ssh findmnt -T /mount2                                                                                                            │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:39 UTC │ 24 Nov 25 09:39 UTC │
	│ ssh       │ functional-291288 ssh findmnt -T /mount3                                                                                                            │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:39 UTC │ 24 Nov 25 09:39 UTC │
	│ mount     │ -p functional-291288 --kill=true                                                                                                                    │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:39 UTC │                     │
	│ start     │ -p functional-291288 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-beta.0 │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:39 UTC │                     │
	│ start     │ -p functional-291288 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-beta.0 │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:39 UTC │                     │
	│ start     │ -p functional-291288 --dry-run --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-beta.0           │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:39 UTC │                     │
	│ dashboard │ --url --port 36195 -p functional-291288 --alsologtostderr -v=1                                                                                      │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:39 UTC │                     │
	└───────────┴─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/11/24 09:39:56
	Running on machine: ip-172-31-30-239
	Binary: Built with gc go1.25.3 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1124 09:39:56.954354 1725830 out.go:360] Setting OutFile to fd 1 ...
	I1124 09:39:56.954575 1725830 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1124 09:39:56.954607 1725830 out.go:374] Setting ErrFile to fd 2...
	I1124 09:39:56.954629 1725830 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1124 09:39:56.954894 1725830 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21978-1652607/.minikube/bin
	I1124 09:39:56.955306 1725830 out.go:368] Setting JSON to false
	I1124 09:39:56.956175 1725830 start.go:133] hostinfo: {"hostname":"ip-172-31-30-239","uptime":30126,"bootTime":1763947071,"procs":159,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"92f46a7d-c249-4c12-924a-77f64874c910"}
	I1124 09:39:56.956275 1725830 start.go:143] virtualization:  
	I1124 09:39:56.959619 1725830 out.go:179] * [functional-291288] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1124 09:39:56.962741 1725830 notify.go:221] Checking for updates...
	I1124 09:39:56.963243 1725830 out.go:179]   - MINIKUBE_LOCATION=21978
	I1124 09:39:56.966478 1725830 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1124 09:39:56.969366 1725830 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/21978-1652607/kubeconfig
	I1124 09:39:56.972220 1725830 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/21978-1652607/.minikube
	I1124 09:39:56.975068 1725830 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1124 09:39:56.977802 1725830 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1124 09:39:56.983230 1725830 config.go:182] Loaded profile config "functional-291288": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1124 09:39:56.983876 1725830 driver.go:422] Setting default libvirt URI to qemu:///system
	I1124 09:39:57.018751 1725830 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1124 09:39:57.018862 1725830 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1124 09:39:57.081141 1725830 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-11-24 09:39:57.071744274 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214831104 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1124 09:39:57.081249 1725830 docker.go:319] overlay module found
	I1124 09:39:57.084257 1725830 out.go:179] * Using the docker driver based on existing profile
	I1124 09:39:57.086942 1725830 start.go:309] selected driver: docker
	I1124 09:39:57.086961 1725830 start.go:927] validating driver "docker" against &{Name:functional-291288 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-291288 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1124 09:39:57.087105 1725830 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1124 09:39:57.087221 1725830 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1124 09:39:57.145343 1725830 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-11-24 09:39:57.136089496 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214831104 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1124 09:39:57.145801 1725830 cni.go:84] Creating CNI manager for ""
	I1124 09:39:57.145879 1725830 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1124 09:39:57.145918 1725830 start.go:353] cluster config:
	{Name:functional-291288 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-291288 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1124 09:39:57.148910 1725830 out.go:179] * dry-run validation complete!
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> containerd <==
	Nov 24 09:38:08 functional-291288 containerd[10324]: time="2025-11-24T09:38:08.742098573Z" level=info msg="ImageUpdate event name:\"docker.io/kicbase/echo-server:functional-291288\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Nov 24 09:38:09 functional-291288 containerd[10324]: time="2025-11-24T09:38:09.725115511Z" level=info msg="RemoveImage \"kicbase/echo-server:functional-291288\""
	Nov 24 09:38:09 functional-291288 containerd[10324]: time="2025-11-24T09:38:09.727769003Z" level=info msg="ImageDelete event name:\"docker.io/kicbase/echo-server:functional-291288\""
	Nov 24 09:38:09 functional-291288 containerd[10324]: time="2025-11-24T09:38:09.729992993Z" level=info msg="ImageDelete event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\""
	Nov 24 09:38:09 functional-291288 containerd[10324]: time="2025-11-24T09:38:09.740634129Z" level=info msg="RemoveImage \"kicbase/echo-server:functional-291288\" returns successfully"
	Nov 24 09:38:10 functional-291288 containerd[10324]: time="2025-11-24T09:38:10.017725287Z" level=info msg="No images store for sha256:af1a838d2702e4e84137a83a66ae93ebb59c7bf115bf022cc84ce1a55dfd3fb4"
	Nov 24 09:38:10 functional-291288 containerd[10324]: time="2025-11-24T09:38:10.020247594Z" level=info msg="ImageCreate event name:\"docker.io/kicbase/echo-server:functional-291288\""
	Nov 24 09:38:10 functional-291288 containerd[10324]: time="2025-11-24T09:38:10.028698216Z" level=info msg="ImageCreate event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Nov 24 09:38:10 functional-291288 containerd[10324]: time="2025-11-24T09:38:10.029232770Z" level=info msg="ImageUpdate event name:\"docker.io/kicbase/echo-server:functional-291288\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Nov 24 09:38:11 functional-291288 containerd[10324]: time="2025-11-24T09:38:11.459119625Z" level=info msg="RemoveImage \"kicbase/echo-server:functional-291288\""
	Nov 24 09:38:11 functional-291288 containerd[10324]: time="2025-11-24T09:38:11.462708306Z" level=info msg="ImageDelete event name:\"docker.io/kicbase/echo-server:functional-291288\""
	Nov 24 09:38:11 functional-291288 containerd[10324]: time="2025-11-24T09:38:11.465197440Z" level=info msg="ImageDelete event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\""
	Nov 24 09:38:11 functional-291288 containerd[10324]: time="2025-11-24T09:38:11.482046877Z" level=info msg="RemoveImage \"kicbase/echo-server:functional-291288\" returns successfully"
	Nov 24 09:38:11 functional-291288 containerd[10324]: time="2025-11-24T09:38:11.784805820Z" level=info msg="No images store for sha256:af1a838d2702e4e84137a83a66ae93ebb59c7bf115bf022cc84ce1a55dfd3fb4"
	Nov 24 09:38:11 functional-291288 containerd[10324]: time="2025-11-24T09:38:11.787158164Z" level=info msg="ImageCreate event name:\"docker.io/kicbase/echo-server:functional-291288\""
	Nov 24 09:38:11 functional-291288 containerd[10324]: time="2025-11-24T09:38:11.795127091Z" level=info msg="ImageCreate event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Nov 24 09:38:11 functional-291288 containerd[10324]: time="2025-11-24T09:38:11.795603535Z" level=info msg="ImageUpdate event name:\"docker.io/kicbase/echo-server:functional-291288\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Nov 24 09:38:12 functional-291288 containerd[10324]: time="2025-11-24T09:38:12.816798467Z" level=info msg="RemoveImage \"kicbase/echo-server:functional-291288\""
	Nov 24 09:38:12 functional-291288 containerd[10324]: time="2025-11-24T09:38:12.819240765Z" level=info msg="ImageDelete event name:\"docker.io/kicbase/echo-server:functional-291288\""
	Nov 24 09:38:12 functional-291288 containerd[10324]: time="2025-11-24T09:38:12.822304091Z" level=info msg="ImageDelete event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\""
	Nov 24 09:38:12 functional-291288 containerd[10324]: time="2025-11-24T09:38:12.835765777Z" level=info msg="RemoveImage \"kicbase/echo-server:functional-291288\" returns successfully"
	Nov 24 09:38:13 functional-291288 containerd[10324]: time="2025-11-24T09:38:13.649607557Z" level=info msg="No images store for sha256:80154cc39374c5be6259fccbd4295ce399d3a1d7b6e10b99200044587775c910"
	Nov 24 09:38:13 functional-291288 containerd[10324]: time="2025-11-24T09:38:13.651890157Z" level=info msg="ImageCreate event name:\"docker.io/kicbase/echo-server:functional-291288\""
	Nov 24 09:38:13 functional-291288 containerd[10324]: time="2025-11-24T09:38:13.659732716Z" level=info msg="ImageCreate event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Nov 24 09:38:13 functional-291288 containerd[10324]: time="2025-11-24T09:38:13.660101507Z" level=info msg="ImageUpdate event name:\"docker.io/kicbase/echo-server:functional-291288\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:39:58.471475   24373 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:39:58.471962   24373 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:39:58.473791   24373 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:39:58.474396   24373 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:39:58.476042   24373 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[Nov24 08:20] overlayfs: idmapped layers are currently not supported
	[Nov24 08:21] overlayfs: idmapped layers are currently not supported
	[Nov24 08:22] overlayfs: idmapped layers are currently not supported
	[Nov24 08:23] overlayfs: idmapped layers are currently not supported
	[Nov24 08:24] overlayfs: idmapped layers are currently not supported
	[ +19.566509] overlayfs: idmapped layers are currently not supported
	[Nov24 08:25] overlayfs: idmapped layers are currently not supported
	[ +35.934095] overlayfs: idmapped layers are currently not supported
	[Nov24 08:27] overlayfs: idmapped layers are currently not supported
	[Nov24 08:28] overlayfs: idmapped layers are currently not supported
	[Nov24 08:29] overlayfs: idmapped layers are currently not supported
	[Nov24 08:30] overlayfs: idmapped layers are currently not supported
	[Nov24 08:31] overlayfs: idmapped layers are currently not supported
	[ +44.505180] overlayfs: idmapped layers are currently not supported
	[Nov24 08:32] overlayfs: idmapped layers are currently not supported
	[Nov24 08:33] overlayfs: idmapped layers are currently not supported
	[Nov24 08:34] overlayfs: idmapped layers are currently not supported
	[ +21.137877] overlayfs: idmapped layers are currently not supported
	[ +28.724175] overlayfs: idmapped layers are currently not supported
	[Nov24 08:36] overlayfs: idmapped layers are currently not supported
	[Nov24 08:37] overlayfs: idmapped layers are currently not supported
	[Nov24 08:38] overlayfs: idmapped layers are currently not supported
	[  +4.063834] overlayfs: idmapped layers are currently not supported
	[Nov24 08:40] overlayfs: idmapped layers are currently not supported
	[Nov24 08:42] kauditd_printk_skb: 8 callbacks suppressed
	
	
	==> kernel <==
	 09:39:58 up  8:22,  0 user,  load average: 1.01, 0.45, 0.40
	Linux functional-291288 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Nov 24 09:39:55 functional-291288 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Nov 24 09:39:55 functional-291288 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 476.
	Nov 24 09:39:55 functional-291288 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:39:55 functional-291288 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:39:55 functional-291288 kubelet[24234]: E1124 09:39:55.959406   24234 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Nov 24 09:39:55 functional-291288 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Nov 24 09:39:55 functional-291288 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Nov 24 09:39:56 functional-291288 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 477.
	Nov 24 09:39:56 functional-291288 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:39:56 functional-291288 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:39:56 functional-291288 kubelet[24255]: E1124 09:39:56.704403   24255 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Nov 24 09:39:56 functional-291288 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Nov 24 09:39:56 functional-291288 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Nov 24 09:39:57 functional-291288 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 478.
	Nov 24 09:39:57 functional-291288 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:39:57 functional-291288 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:39:57 functional-291288 kubelet[24268]: E1124 09:39:57.436057   24268 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Nov 24 09:39:57 functional-291288 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Nov 24 09:39:57 functional-291288 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Nov 24 09:39:58 functional-291288 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 479.
	Nov 24 09:39:58 functional-291288 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:39:58 functional-291288 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:39:58 functional-291288 kubelet[24298]: E1124 09:39:58.205028   24298 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Nov 24 09:39:58 functional-291288 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Nov 24 09:39:58 functional-291288 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	

-- /stdout --
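The kubelet restart loop in the log above fails validation on every cycle for the same reason: this kubelet build refuses to run on a cgroup v1 host. A host-side sketch of that check (not part of the harness; it uses the common convention that a cgroup v2 unified hierarchy exposes `/sys/fs/cgroup/cgroup.controllers`):

```python
# Diagnostic sketch: report which cgroup hierarchy the host exposes.
# On a cgroup v2 (unified) host the mount point carries a
# cgroup.controllers file; on a v1/hybrid host it does not.
from pathlib import Path

unified = Path("/sys/fs/cgroup/cgroup.controllers")
if unified.exists():
    print("cgroup v2 - kubelet validation should pass")
else:
    print("cgroup v1/hybrid - matches the kubelet failure above")
```

On a v1 host the usual remedy is booting the kernel with the unified hierarchy enabled (e.g. `systemd.unified_cgroup_hierarchy=1`), rather than reconfiguring kubelet.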
helpers_test.go:262: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-291288 -n functional-291288
helpers_test.go:262: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-291288 -n functional-291288: exit status 2 (323.409868ms)

-- stdout --
	Stopped

-- /stdout --
helpers_test.go:262: status error: exit status 2 (may be ok)
helpers_test.go:264: "functional-291288" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/DashboardCmd (1.73s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/StatusCmd (3.18s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/StatusCmd
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/StatusCmd

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/StatusCmd
functional_test.go:869: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 status
functional_test.go:869: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-291288 status: exit status 2 (318.648186ms)

-- stdout --
	functional-291288
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Stopped
	kubeconfig: Configured
	

-- /stdout --
functional_test.go:871: failed to run minikube status. args "out/minikube-linux-arm64 -p functional-291288 status" : exit status 2
functional_test.go:875: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 status -f host:{{.Host}},kublet:{{.Kubelet}},apiserver:{{.APIServer}},kubeconfig:{{.Kubeconfig}}
functional_test.go:875: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-291288 status -f host:{{.Host}},kublet:{{.Kubelet}},apiserver:{{.APIServer}},kubeconfig:{{.Kubeconfig}}: exit status 2 (309.434782ms)

-- stdout --
	host:Running,kublet:Stopped,apiserver:Stopped,kubeconfig:Configured

-- /stdout --
functional_test.go:877: failed to run minikube status with custom format: args "out/minikube-linux-arm64 -p functional-291288 status -f host:{{.Host}},kublet:{{.Kubelet}},apiserver:{{.APIServer}},kubeconfig:{{.Kubeconfig}}": exit status 2
functional_test.go:887: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 status -o json
functional_test.go:887: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-291288 status -o json: exit status 2 (309.169171ms)

-- stdout --
	{"Name":"functional-291288","Host":"Running","Kubelet":"Stopped","APIServer":"Stopped","Kubeconfig":"Configured","Worker":false}

-- /stdout --
functional_test.go:889: failed to run minikube status with json output. args "out/minikube-linux-arm64 -p functional-291288 status -o json" : exit status 2
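The JSON line above is machine-readable, which is how the harness decides (as at helpers_test.go:264) whether kubectl commands are worth attempting. A minimal sketch of that check (the JSON literal is copied verbatim from the output above; the field names follow minikube's status schema):

```python
import json

# Status line exactly as printed by `minikube status -o json` above.
raw = ('{"Name":"functional-291288","Host":"Running","Kubelet":"Stopped",'
       '"APIServer":"Stopped","Kubeconfig":"Configured","Worker":false}')
status = json.loads(raw)

# Mirror the harness's decision: kubectl is only useful with a live apiserver.
if status["APIServer"] != "Running":
    print(f'{status["Name"]}: apiserver {status["APIServer"]}, skipping kubectl')
```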
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:223: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/StatusCmd]: network settings <======
helpers_test.go:230: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:238: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/StatusCmd]: docker inspect <======
helpers_test.go:239: (dbg) Run:  docker inspect functional-291288
helpers_test.go:243: (dbg) docker inspect functional-291288:

-- stdout --
	[
	    {
	        "Id": "70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52",
	        "Created": "2025-11-24T09:10:51.896020191Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 1695240,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-11-24T09:10:51.968983407Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:572c983e466f1f784136812eef5cc59ac623db764bc7704d3676c4643993fd08",
	        "ResolvConfPath": "/var/lib/docker/containers/70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52/hostname",
	        "HostsPath": "/var/lib/docker/containers/70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52/hosts",
	        "LogPath": "/var/lib/docker/containers/70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52/70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52-json.log",
	        "Name": "/functional-291288",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "functional-291288:/var",
	                "/lib/modules:/lib/modules:ro"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-291288",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52",
	                "LowerDir": "/var/lib/docker/overlay2/3cc6f5f8c809d502c515552ad283ef0dee330beb830a15376b0447c77fbc81b7-init/diff:/var/lib/docker/overlay2/bf294786c7ade59cb761c7bdcc27045362c3779b00a569b685443f34ade17737/diff",
	                "MergedDir": "/var/lib/docker/overlay2/3cc6f5f8c809d502c515552ad283ef0dee330beb830a15376b0447c77fbc81b7/merged",
	                "UpperDir": "/var/lib/docker/overlay2/3cc6f5f8c809d502c515552ad283ef0dee330beb830a15376b0447c77fbc81b7/diff",
	                "WorkDir": "/var/lib/docker/overlay2/3cc6f5f8c809d502c515552ad283ef0dee330beb830a15376b0447c77fbc81b7/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "volume",
	                "Name": "functional-291288",
	                "Source": "/var/lib/docker/volumes/functional-291288/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            },
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-291288",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-291288",
	                "name.minikube.sigs.k8s.io": "functional-291288",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "09c1c2eef0dca6362dde63b4cbc372c0cfa3e4fd084b8745043d8b88925691bf",
	            "SandboxKey": "/var/run/docker/netns/09c1c2eef0dc",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34684"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34685"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34688"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34686"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34687"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-291288": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "7e:49:22:0b:f9:2c",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "1e8f91e8ad9f46b831bbb1b0589b0022d940ee9875e64a648dc80612f3ca93dc",
	                    "EndpointID": "5de5ca8ccb07584b21e6e4e30dba12e0233e8d28c3e48e705cddffe75263b337",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-291288",
	                        "70848be15fcc"
	                    ]
	                }
	            }
	        }
	    }
	]

-- /stdout --
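The `NetworkSettings.Ports` map in the inspect output above is what ties the apiserver to the host: container port 8441/tcp is published on 127.0.0.1:34687, the address behind the "connection to the server localhost:8441 was refused" errors. A minimal sketch of reading that structure (the dict fragment is copied from the inspect output above):

```python
# Port-bindings fragment as it appears in the docker inspect output above.
ports = {
    "8441/tcp": [{"HostIp": "127.0.0.1", "HostPort": "34687"}],
}

# First (and only) binding for the apiserver port.
binding = ports["8441/tcp"][0]
apiserver_url = f'https://{binding["HostIp"]}:{binding["HostPort"]}'
print(apiserver_url)  # → https://127.0.0.1:34687
```

The same value can be read directly with `docker inspect --format` and Go template syntax against the live container.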
helpers_test.go:247: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-291288 -n functional-291288
helpers_test.go:247: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-291288 -n functional-291288: exit status 2 (320.013714ms)

-- stdout --
	Running

-- /stdout --
helpers_test.go:247: status error: exit status 2 (may be ok)
helpers_test.go:252: <<< TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/StatusCmd FAILED: start of post-mortem logs <<<
helpers_test.go:253: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/StatusCmd]: minikube logs <======
helpers_test.go:255: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 logs -n 25
helpers_test.go:260: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/StatusCmd logs: 
-- stdout --
	
	==> Audit <==
	┌─────────┬─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│ COMMAND │                                                                        ARGS                                                                         │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├─────────┼─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ service │ functional-291288 service list                                                                                                                      │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:39 UTC │                     │
	│ service │ functional-291288 service list -o json                                                                                                              │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:39 UTC │                     │
	│ service │ functional-291288 service --namespace=default --https --url hello-node                                                                              │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:39 UTC │                     │
	│ service │ functional-291288 service hello-node --url --format={{.IP}}                                                                                         │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:39 UTC │                     │
	│ service │ functional-291288 service hello-node --url                                                                                                          │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:39 UTC │                     │
	│ mount   │ -p functional-291288 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo1536350623/001:/mount-9p --alsologtostderr -v=1              │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:39 UTC │                     │
	│ ssh     │ functional-291288 ssh findmnt -T /mount-9p | grep 9p                                                                                                │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:39 UTC │                     │
	│ ssh     │ functional-291288 ssh findmnt -T /mount-9p | grep 9p                                                                                                │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:39 UTC │ 24 Nov 25 09:39 UTC │
	│ ssh     │ functional-291288 ssh -- ls -la /mount-9p                                                                                                           │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:39 UTC │ 24 Nov 25 09:39 UTC │
	│ ssh     │ functional-291288 ssh cat /mount-9p/test-1763977187076229275                                                                                        │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:39 UTC │ 24 Nov 25 09:39 UTC │
	│ ssh     │ functional-291288 ssh mount | grep 9p; ls -la /mount-9p; cat /mount-9p/pod-dates                                                                    │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:39 UTC │                     │
	│ ssh     │ functional-291288 ssh sudo umount -f /mount-9p                                                                                                      │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:39 UTC │ 24 Nov 25 09:39 UTC │
	│ ssh     │ functional-291288 ssh findmnt -T /mount-9p | grep 9p                                                                                                │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:39 UTC │                     │
	│ mount   │ -p functional-291288 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo2133955644/001:/mount-9p --alsologtostderr -v=1 --port 46464 │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:39 UTC │                     │
	│ ssh     │ functional-291288 ssh findmnt -T /mount-9p | grep 9p                                                                                                │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:39 UTC │ 24 Nov 25 09:39 UTC │
	│ ssh     │ functional-291288 ssh -- ls -la /mount-9p                                                                                                           │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:39 UTC │ 24 Nov 25 09:39 UTC │
	│ ssh     │ functional-291288 ssh sudo umount -f /mount-9p                                                                                                      │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:39 UTC │                     │
	│ mount   │ -p functional-291288 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo2066364349/001:/mount1 --alsologtostderr -v=1                │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:39 UTC │                     │
	│ mount   │ -p functional-291288 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo2066364349/001:/mount2 --alsologtostderr -v=1                │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:39 UTC │                     │
	│ ssh     │ functional-291288 ssh findmnt -T /mount1                                                                                                            │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:39 UTC │                     │
	│ mount   │ -p functional-291288 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo2066364349/001:/mount3 --alsologtostderr -v=1                │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:39 UTC │                     │
	│ ssh     │ functional-291288 ssh findmnt -T /mount1                                                                                                            │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:39 UTC │ 24 Nov 25 09:39 UTC │
	│ ssh     │ functional-291288 ssh findmnt -T /mount2                                                                                                            │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:39 UTC │ 24 Nov 25 09:39 UTC │
	│ ssh     │ functional-291288 ssh findmnt -T /mount3                                                                                                            │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:39 UTC │ 24 Nov 25 09:39 UTC │
	│ mount   │ -p functional-291288 --kill=true                                                                                                                    │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:39 UTC │                     │
	└─────────┴─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/11/24 09:25:43
	Running on machine: ip-172-31-30-239
	Binary: Built with gc go1.25.3 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1124 09:25:43.956868 1707070 out.go:360] Setting OutFile to fd 1 ...
	I1124 09:25:43.957002 1707070 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1124 09:25:43.957006 1707070 out.go:374] Setting ErrFile to fd 2...
	I1124 09:25:43.957010 1707070 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1124 09:25:43.957247 1707070 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21978-1652607/.minikube/bin
	I1124 09:25:43.957575 1707070 out.go:368] Setting JSON to false
	I1124 09:25:43.958421 1707070 start.go:133] hostinfo: {"hostname":"ip-172-31-30-239","uptime":29273,"bootTime":1763947071,"procs":157,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"92f46a7d-c249-4c12-924a-77f64874c910"}
	I1124 09:25:43.958501 1707070 start.go:143] virtualization:  
	I1124 09:25:43.961954 1707070 out.go:179] * [functional-291288] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1124 09:25:43.965745 1707070 out.go:179]   - MINIKUBE_LOCATION=21978
	I1124 09:25:43.965806 1707070 notify.go:221] Checking for updates...
	I1124 09:25:43.971831 1707070 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1124 09:25:43.974596 1707070 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/21978-1652607/kubeconfig
	I1124 09:25:43.977531 1707070 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/21978-1652607/.minikube
	I1124 09:25:43.980447 1707070 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1124 09:25:43.983266 1707070 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1124 09:25:43.986897 1707070 config.go:182] Loaded profile config "functional-291288": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1124 09:25:43.986999 1707070 driver.go:422] Setting default libvirt URI to qemu:///system
	I1124 09:25:44.009686 1707070 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1124 09:25:44.009789 1707070 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1124 09:25:44.075505 1707070 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:55 SystemTime:2025-11-24 09:25:44.065719192 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214831104 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1124 09:25:44.075607 1707070 docker.go:319] overlay module found
	I1124 09:25:44.080493 1707070 out.go:179] * Using the docker driver based on existing profile
	I1124 09:25:44.083298 1707070 start.go:309] selected driver: docker
	I1124 09:25:44.083323 1707070 start.go:927] validating driver "docker" against &{Name:functional-291288 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-291288 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1124 09:25:44.083409 1707070 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1124 09:25:44.083513 1707070 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1124 09:25:44.137525 1707070 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:55 SystemTime:2025-11-24 09:25:44.127840235 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214831104 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1124 09:25:44.137959 1707070 start_flags.go:992] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I1124 09:25:44.137984 1707070 cni.go:84] Creating CNI manager for ""
	I1124 09:25:44.138040 1707070 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1124 09:25:44.138097 1707070 start.go:353] cluster config:
	{Name:functional-291288 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-291288 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1124 09:25:44.143064 1707070 out.go:179] * Starting "functional-291288" primary control-plane node in "functional-291288" cluster
	I1124 09:25:44.145761 1707070 cache.go:134] Beginning downloading kic base image for docker with containerd
	I1124 09:25:44.148578 1707070 out.go:179] * Pulling base image v0.0.48-1763789673-21948 ...
	I1124 09:25:44.151418 1707070 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
	I1124 09:25:44.151496 1707070 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f in local docker daemon
	I1124 09:25:44.171581 1707070 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f in local docker daemon, skipping pull
	I1124 09:25:44.171593 1707070 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f exists in daemon, skipping load
	W1124 09:25:44.210575 1707070 preload.go:144] https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.35.0-beta.0/preloaded-images-k8s-v18-v1.35.0-beta.0-containerd-overlay2-arm64.tar.lz4 status code: 404
	W1124 09:25:44.425167 1707070 preload.go:144] https://github.com/kubernetes-sigs/minikube-preloads/releases/download/v18/preloaded-images-k8s-v18-v1.35.0-beta.0-containerd-overlay2-arm64.tar.lz4 status code: 404
	I1124 09:25:44.425335 1707070 profile.go:143] Saving config to /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/config.json ...
	I1124 09:25:44.425459 1707070 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:25:44.425602 1707070 cache.go:243] Successfully downloaded all kic artifacts
	I1124 09:25:44.425631 1707070 start.go:360] acquireMachinesLock for functional-291288: {Name:mk85384dc057570e1f34db593d357cea738652c4 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:25:44.425681 1707070 start.go:364] duration metric: took 28.381µs to acquireMachinesLock for "functional-291288"
	I1124 09:25:44.425694 1707070 start.go:96] Skipping create...Using existing machine configuration
	I1124 09:25:44.425698 1707070 fix.go:54] fixHost starting: 
	I1124 09:25:44.425962 1707070 cli_runner.go:164] Run: docker container inspect functional-291288 --format={{.State.Status}}
	I1124 09:25:44.443478 1707070 fix.go:112] recreateIfNeeded on functional-291288: state=Running err=<nil>
	W1124 09:25:44.443512 1707070 fix.go:138] unexpected machine state, will restart: <nil>
	I1124 09:25:44.447296 1707070 out.go:252] * Updating the running docker "functional-291288" container ...
	I1124 09:25:44.447326 1707070 machine.go:94] provisionDockerMachine start ...
	I1124 09:25:44.447405 1707070 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:25:44.465953 1707070 main.go:143] libmachine: Using SSH client type: native
	I1124 09:25:44.466284 1707070 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 34684 <nil> <nil>}
	I1124 09:25:44.466291 1707070 main.go:143] libmachine: About to run SSH command:
	hostname
	I1124 09:25:44.603673 1707070 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:25:44.618572 1707070 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-291288
	
	I1124 09:25:44.618586 1707070 ubuntu.go:182] provisioning hostname "functional-291288"
	I1124 09:25:44.618668 1707070 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:25:44.659382 1707070 main.go:143] libmachine: Using SSH client type: native
	I1124 09:25:44.659732 1707070 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 34684 <nil> <nil>}
	I1124 09:25:44.659741 1707070 main.go:143] libmachine: About to run SSH command:
	sudo hostname functional-291288 && echo "functional-291288" | sudo tee /etc/hostname
	I1124 09:25:44.806505 1707070 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:25:44.844189 1707070 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-291288
	
	I1124 09:25:44.844281 1707070 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:25:44.868659 1707070 main.go:143] libmachine: Using SSH client type: native
	I1124 09:25:44.869019 1707070 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 34684 <nil> <nil>}
	I1124 09:25:44.869041 1707070 main.go:143] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sfunctional-291288' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 functional-291288/g' /etc/hosts;
				else 
					echo '127.0.1.1 functional-291288' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1124 09:25:44.979106 1707070 cache.go:107] acquiring lock: {Name:mk22a10f0ce1f3295b61e7e76c455d0494a3e278 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:25:44.979193 1707070 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 exists
	I1124 09:25:44.979201 1707070 cache.go:96] cache image "gcr.io/k8s-minikube/storage-provisioner:v5" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5" took 127.862µs
	I1124 09:25:44.979207 1707070 cache.go:80] save to tar file gcr.io/k8s-minikube/storage-provisioner:v5 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 succeeded
	I1124 09:25:44.979198 1707070 cache.go:107] acquiring lock: {Name:mk80fdbe7cdb5bc17c2a82b4ecfd00214559a435 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:25:44.979218 1707070 cache.go:107] acquiring lock: {Name:mk85f1502dbb97830776608fb729eb3605e112e6 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:25:44.979237 1707070 cache.go:107] acquiring lock: {Name:mk46ce3b59d7e062b3dbc8a90fe5b4231f256471 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:25:44.979267 1707070 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 exists
	I1124 09:25:44.979266 1707070 cache.go:107] acquiring lock: {Name:mk1cf42e67442503a46c578224bd3cb68bf682d4 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:25:44.979273 1707070 cache.go:96] cache image "registry.k8s.io/pause:3.10.1" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1" took 55.992µs
	I1124 09:25:44.979277 1707070 cache.go:80] save to tar file registry.k8s.io/pause:3.10.1 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 succeeded
	I1124 09:25:44.979285 1707070 cache.go:107] acquiring lock: {Name:mk726502cb84c177b2e14fee88512325761511c5 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:25:44.979301 1707070 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 exists
	I1124 09:25:44.979310 1707070 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 exists
	I1124 09:25:44.979308 1707070 cache.go:96] cache image "registry.k8s.io/kube-apiserver:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0" took 43.274µs
	I1124 09:25:44.979314 1707070 cache.go:96] cache image "registry.k8s.io/coredns/coredns:v1.13.1" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1" took 29.982µs
	I1124 09:25:44.979319 1707070 cache.go:80] save to tar file registry.k8s.io/coredns/coredns:v1.13.1 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 succeeded
	I1124 09:25:44.979319 1707070 cache.go:80] save to tar file registry.k8s.io/kube-apiserver:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 succeeded
	I1124 09:25:44.979326 1707070 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.5.24-0 exists
	I1124 09:25:44.979330 1707070 cache.go:96] cache image "registry.k8s.io/etcd:3.5.24-0" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.5.24-0" took 94.392µs
	I1124 09:25:44.979336 1707070 cache.go:80] save to tar file registry.k8s.io/etcd:3.5.24-0 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.5.24-0 succeeded
	I1124 09:25:44.979330 1707070 cache.go:107] acquiring lock: {Name:mkfdc49c8e68aee34cee0c9d441ae8a4dca675c6 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:25:44.979345 1707070 cache.go:107] acquiring lock: {Name:mkdbf38e05e2c47c1a7a906a2236e9e7020a94c5 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:25:44.979364 1707070 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 exists
	I1124 09:25:44.979370 1707070 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 exists
	I1124 09:25:44.979368 1707070 cache.go:96] cache image "registry.k8s.io/kube-controller-manager:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0" took 40.427µs
	I1124 09:25:44.979373 1707070 cache.go:96] cache image "registry.k8s.io/kube-scheduler:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0" took 29.49µs
	I1124 09:25:44.979375 1707070 cache.go:80] save to tar file registry.k8s.io/kube-controller-manager:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 succeeded
	I1124 09:25:44.979378 1707070 cache.go:80] save to tar file registry.k8s.io/kube-scheduler:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 succeeded
	I1124 09:25:44.979407 1707070 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 exists
	I1124 09:25:44.979413 1707070 cache.go:96] cache image "registry.k8s.io/kube-proxy:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0" took 225.709µs
	I1124 09:25:44.979418 1707070 cache.go:80] save to tar file registry.k8s.io/kube-proxy:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 succeeded
	I1124 09:25:44.979424 1707070 cache.go:87] Successfully saved all images to host disk.
	I1124 09:25:45.028668 1707070 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	I1124 09:25:45.028686 1707070 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/21978-1652607/.minikube CaCertPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/21978-1652607/.minikube}
	I1124 09:25:45.028706 1707070 ubuntu.go:190] setting up certificates
	I1124 09:25:45.028727 1707070 provision.go:84] configureAuth start
	I1124 09:25:45.028800 1707070 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-291288
	I1124 09:25:45.083635 1707070 provision.go:143] copyHostCerts
	I1124 09:25:45.083709 1707070 exec_runner.go:144] found /home/jenkins/minikube-integration/21978-1652607/.minikube/key.pem, removing ...
	I1124 09:25:45.083718 1707070 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21978-1652607/.minikube/key.pem
	I1124 09:25:45.083806 1707070 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/21978-1652607/.minikube/key.pem (1679 bytes)
	I1124 09:25:45.083920 1707070 exec_runner.go:144] found /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.pem, removing ...
	I1124 09:25:45.083924 1707070 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.pem
	I1124 09:25:45.083951 1707070 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.pem (1078 bytes)
	I1124 09:25:45.084006 1707070 exec_runner.go:144] found /home/jenkins/minikube-integration/21978-1652607/.minikube/cert.pem, removing ...
	I1124 09:25:45.084009 1707070 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21978-1652607/.minikube/cert.pem
	I1124 09:25:45.084038 1707070 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/21978-1652607/.minikube/cert.pem (1123 bytes)
	I1124 09:25:45.084083 1707070 provision.go:117] generating server cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca-key.pem org=jenkins.functional-291288 san=[127.0.0.1 192.168.49.2 functional-291288 localhost minikube]
	I1124 09:25:45.498574 1707070 provision.go:177] copyRemoteCerts
	I1124 09:25:45.498637 1707070 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1124 09:25:45.498677 1707070 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:25:45.520187 1707070 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34684 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-291288/id_rsa Username:docker}
	I1124 09:25:45.626724 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I1124 09:25:45.644660 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server.pem --> /etc/docker/server.pem (1220 bytes)
	I1124 09:25:45.663269 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I1124 09:25:45.681392 1707070 provision.go:87] duration metric: took 652.643227ms to configureAuth
	I1124 09:25:45.681410 1707070 ubuntu.go:206] setting minikube options for container-runtime
	I1124 09:25:45.681611 1707070 config.go:182] Loaded profile config "functional-291288": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1124 09:25:45.681617 1707070 machine.go:97] duration metric: took 1.234286229s to provisionDockerMachine
	I1124 09:25:45.681624 1707070 start.go:293] postStartSetup for "functional-291288" (driver="docker")
	I1124 09:25:45.681634 1707070 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1124 09:25:45.681687 1707070 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1124 09:25:45.681727 1707070 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:25:45.698790 1707070 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34684 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-291288/id_rsa Username:docker}
	I1124 09:25:45.802503 1707070 ssh_runner.go:195] Run: cat /etc/os-release
	I1124 09:25:45.805922 1707070 main.go:143] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1124 09:25:45.805944 1707070 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1124 09:25:45.805954 1707070 filesync.go:126] Scanning /home/jenkins/minikube-integration/21978-1652607/.minikube/addons for local assets ...
	I1124 09:25:45.806011 1707070 filesync.go:126] Scanning /home/jenkins/minikube-integration/21978-1652607/.minikube/files for local assets ...
	I1124 09:25:45.806087 1707070 filesync.go:149] local asset: /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/ssl/certs/16544672.pem -> 16544672.pem in /etc/ssl/certs
	I1124 09:25:45.806167 1707070 filesync.go:149] local asset: /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/test/nested/copy/1654467/hosts -> hosts in /etc/test/nested/copy/1654467
	I1124 09:25:45.806257 1707070 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs /etc/test/nested/copy/1654467
	I1124 09:25:45.814093 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/ssl/certs/16544672.pem --> /etc/ssl/certs/16544672.pem (1708 bytes)
	I1124 09:25:45.832308 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/test/nested/copy/1654467/hosts --> /etc/test/nested/copy/1654467/hosts (40 bytes)
	I1124 09:25:45.850625 1707070 start.go:296] duration metric: took 168.9873ms for postStartSetup
	I1124 09:25:45.850696 1707070 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1124 09:25:45.850734 1707070 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:25:45.868479 1707070 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34684 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-291288/id_rsa Username:docker}
	I1124 09:25:45.971382 1707070 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1124 09:25:45.976655 1707070 fix.go:56] duration metric: took 1.550948262s for fixHost
	I1124 09:25:45.976671 1707070 start.go:83] releasing machines lock for "functional-291288", held for 1.550982815s
	I1124 09:25:45.976739 1707070 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-291288
	I1124 09:25:45.997505 1707070 ssh_runner.go:195] Run: cat /version.json
	I1124 09:25:45.997527 1707070 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1124 09:25:45.997550 1707070 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:25:45.997588 1707070 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:25:46.017321 1707070 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34684 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-291288/id_rsa Username:docker}
	I1124 09:25:46.018732 1707070 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34684 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-291288/id_rsa Username:docker}
	I1124 09:25:46.118131 1707070 ssh_runner.go:195] Run: systemctl --version
	I1124 09:25:46.213854 1707070 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W1124 09:25:46.218087 1707070 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1124 09:25:46.218149 1707070 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1124 09:25:46.225944 1707070 cni.go:259] no active bridge cni configs found in "/etc/cni/net.d" - nothing to disable
	I1124 09:25:46.225958 1707070 start.go:496] detecting cgroup driver to use...
	I1124 09:25:46.225989 1707070 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1124 09:25:46.226035 1707070 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I1124 09:25:46.241323 1707070 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I1124 09:25:46.254720 1707070 docker.go:218] disabling cri-docker service (if available) ...
	I1124 09:25:46.254789 1707070 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1124 09:25:46.270340 1707070 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1124 09:25:46.283549 1707070 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1124 09:25:46.399926 1707070 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1124 09:25:46.515234 1707070 docker.go:234] disabling docker service ...
	I1124 09:25:46.515290 1707070 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1124 09:25:46.529899 1707070 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1124 09:25:46.543047 1707070 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1124 09:25:46.658532 1707070 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1124 09:25:46.775880 1707070 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1124 09:25:46.790551 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I1124 09:25:46.806411 1707070 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:25:46.967053 1707070 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10.1"|' /etc/containerd/config.toml"
	I1124 09:25:46.977583 1707070 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I1124 09:25:46.986552 1707070 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I1124 09:25:46.986618 1707070 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I1124 09:25:46.995635 1707070 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1124 09:25:47.005680 1707070 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I1124 09:25:47.015425 1707070 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1124 09:25:47.024808 1707070 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1124 09:25:47.033022 1707070 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I1124 09:25:47.041980 1707070 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I1124 09:25:47.051362 1707070 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
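[editor's note: the run of `sed -i` commands above rewrites /etc/containerd/config.toml in place; the cgroup-driver edit can be reproduced locally with the same substitution. A minimal sketch, assuming GNU sed; the sample config.toml snippet below is an illustration, not the real minikube file:]

```shell
# Reproduce the SystemdCgroup edit from the log on a throwaway file
# (no sudo; the sample TOML content is a hypothetical stand-in).
tmp=$(mktemp)
cat > "$tmp" <<'EOF'
[plugins."io.containerd.cri.v1.runtime".containerd.runtimes.runc.options]
            SystemdCgroup = true
EOF
# Same substitution the log runs: \1 preserves the original indentation.
sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' "$tmp"
grep 'SystemdCgroup = false' "$tmp"
rm -f "$tmp"
```

Note that `sed -i -r` here is GNU sed syntax, matching the Linux guest in the log; BSD/macOS sed would need `-i ''` and `-E`.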
	I1124 09:25:47.060469 1707070 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1124 09:25:47.068004 1707070 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1124 09:25:47.075326 1707070 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1124 09:25:47.191217 1707070 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I1124 09:25:47.313892 1707070 start.go:543] Will wait 60s for socket path /run/containerd/containerd.sock
	I1124 09:25:47.313955 1707070 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I1124 09:25:47.318001 1707070 start.go:564] Will wait 60s for crictl version
	I1124 09:25:47.318060 1707070 ssh_runner.go:195] Run: which crictl
	I1124 09:25:47.321766 1707070 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1124 09:25:47.347974 1707070 start.go:580] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v2.1.5
	RuntimeApiVersion:  v1
	I1124 09:25:47.348042 1707070 ssh_runner.go:195] Run: containerd --version
	I1124 09:25:47.369074 1707070 ssh_runner.go:195] Run: containerd --version
	I1124 09:25:47.394675 1707070 out.go:179] * Preparing Kubernetes v1.35.0-beta.0 on containerd 2.1.5 ...
	I1124 09:25:47.397593 1707070 cli_runner.go:164] Run: docker network inspect functional-291288 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1124 09:25:47.412872 1707070 ssh_runner.go:195] Run: grep 192.168.49.1	host.minikube.internal$ /etc/hosts
	I1124 09:25:47.419437 1707070 out.go:179]   - apiserver.enable-admission-plugins=NamespaceAutoProvision
	I1124 09:25:47.422135 1707070 kubeadm.go:884] updating cluster {Name:functional-291288 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-291288 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1124 09:25:47.422352 1707070 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:25:47.578507 1707070 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:25:47.745390 1707070 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:25:47.894887 1707070 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
	I1124 09:25:47.894982 1707070 ssh_runner.go:195] Run: sudo crictl images --output json
	I1124 09:25:47.919585 1707070 containerd.go:627] all images are preloaded for containerd runtime.
	I1124 09:25:47.919604 1707070 cache_images.go:86] Images are preloaded, skipping loading
	I1124 09:25:47.919612 1707070 kubeadm.go:935] updating node { 192.168.49.2 8441 v1.35.0-beta.0 containerd true true} ...
	I1124 09:25:47.919707 1707070 kubeadm.go:947] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.35.0-beta.0/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=functional-291288 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.49.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-291288 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I1124 09:25:47.919778 1707070 ssh_runner.go:195] Run: sudo crictl info
	I1124 09:25:47.948265 1707070 extraconfig.go:125] Overwriting default enable-admission-plugins=NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota with user provided enable-admission-plugins=NamespaceAutoProvision for component apiserver
	I1124 09:25:47.948285 1707070 cni.go:84] Creating CNI manager for ""
	I1124 09:25:47.948293 1707070 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1124 09:25:47.948308 1707070 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1124 09:25:47.948331 1707070 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.49.2 APIServerPort:8441 KubernetesVersion:v1.35.0-beta.0 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:functional-291288 NodeName:functional-291288 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceAutoProvision] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.49.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.49.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1124 09:25:47.948441 1707070 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.49.2
	  bindPort: 8441
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "functional-291288"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.49.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceAutoProvision"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8441
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.35.0-beta.0
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
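[editor's note: the generated kubeadm.yaml above is a single YAML stream holding four documents separated by `---`. A quick way to see its shape is to list the `kind:` line of each document; the abridged sample below is an illustration, not the full file:]

```shell
# List the kind of each document in a kubeadm-style multi-document
# YAML stream (abridged sample content, not the real kubeadm.yaml).
cat <<'EOF' | grep '^kind:'
apiVersion: kubeadm.k8s.io/v1beta4
kind: InitConfiguration
---
apiVersion: kubeadm.k8s.io/v1beta4
kind: ClusterConfiguration
---
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
---
apiVersion: kubeproxy.config.k8s.io/v1alpha1
kind: KubeProxyConfiguration
EOF
```

This prints one `kind:` line per document, mirroring the four sections of the config above.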
	I1124 09:25:47.948507 1707070 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.35.0-beta.0
	I1124 09:25:47.956183 1707070 binaries.go:51] Found k8s binaries, skipping transfer
	I1124 09:25:47.956246 1707070 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1124 09:25:47.963641 1707070 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (328 bytes)
	I1124 09:25:47.976586 1707070 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (359 bytes)
	I1124 09:25:47.989056 1707070 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2087 bytes)
	I1124 09:25:48.003961 1707070 ssh_runner.go:195] Run: grep 192.168.49.2	control-plane.minikube.internal$ /etc/hosts
	I1124 09:25:48.011533 1707070 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1124 09:25:48.134407 1707070 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1124 09:25:48.383061 1707070 certs.go:69] Setting up /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288 for IP: 192.168.49.2
	I1124 09:25:48.383072 1707070 certs.go:195] generating shared ca certs ...
	I1124 09:25:48.383086 1707070 certs.go:227] acquiring lock for ca certs: {Name:mkbe540a30c4376a351176f7fe6fec044d058b09 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1124 09:25:48.383238 1707070 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.key
	I1124 09:25:48.383279 1707070 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/proxy-client-ca.key
	I1124 09:25:48.383286 1707070 certs.go:257] generating profile certs ...
	I1124 09:25:48.383366 1707070 certs.go:360] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/client.key
	I1124 09:25:48.383420 1707070 certs.go:360] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/apiserver.key.5acb2515
	I1124 09:25:48.383456 1707070 certs.go:360] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/proxy-client.key
	I1124 09:25:48.383562 1707070 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/1654467.pem (1338 bytes)
	W1124 09:25:48.383598 1707070 certs.go:480] ignoring /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/1654467_empty.pem, impossibly tiny 0 bytes
	I1124 09:25:48.383605 1707070 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca-key.pem (1671 bytes)
	I1124 09:25:48.383632 1707070 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem (1078 bytes)
	I1124 09:25:48.383655 1707070 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/cert.pem (1123 bytes)
	I1124 09:25:48.383684 1707070 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/key.pem (1679 bytes)
	I1124 09:25:48.383730 1707070 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/ssl/certs/16544672.pem (1708 bytes)
	I1124 09:25:48.384294 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1124 09:25:48.403533 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I1124 09:25:48.421212 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1124 09:25:48.441887 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I1124 09:25:48.462311 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I1124 09:25:48.480889 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I1124 09:25:48.499086 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1124 09:25:48.517112 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I1124 09:25:48.535554 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/ssl/certs/16544672.pem --> /usr/share/ca-certificates/16544672.pem (1708 bytes)
	I1124 09:25:48.553310 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1124 09:25:48.571447 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/1654467.pem --> /usr/share/ca-certificates/1654467.pem (1338 bytes)
	I1124 09:25:48.589094 1707070 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I1124 09:25:48.602393 1707070 ssh_runner.go:195] Run: openssl version
	I1124 09:25:48.608953 1707070 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/16544672.pem && ln -fs /usr/share/ca-certificates/16544672.pem /etc/ssl/certs/16544672.pem"
	I1124 09:25:48.617886 1707070 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/16544672.pem
	I1124 09:25:48.621697 1707070 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Nov 24 09:10 /usr/share/ca-certificates/16544672.pem
	I1124 09:25:48.621756 1707070 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/16544672.pem
	I1124 09:25:48.663214 1707070 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/16544672.pem /etc/ssl/certs/3ec20f2e.0"
	I1124 09:25:48.671328 1707070 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I1124 09:25:48.679977 1707070 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1124 09:25:48.683961 1707070 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Nov 24 08:44 /usr/share/ca-certificates/minikubeCA.pem
	I1124 09:25:48.684024 1707070 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1124 09:25:48.725273 1707070 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I1124 09:25:48.733278 1707070 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1654467.pem && ln -fs /usr/share/ca-certificates/1654467.pem /etc/ssl/certs/1654467.pem"
	I1124 09:25:48.741887 1707070 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1654467.pem
	I1124 09:25:48.745440 1707070 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Nov 24 09:10 /usr/share/ca-certificates/1654467.pem
	I1124 09:25:48.745500 1707070 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1654467.pem
	I1124 09:25:48.791338 1707070 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1654467.pem /etc/ssl/certs/51391683.0"
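[editor's note: the `openssl x509 -hash -noout` calls above compute the subject-name hash that OpenSSL uses to look up CA certs, which is why the symlinks created next are named like `b5213941.0` and `3ec20f2e.0`. A sketch of the same pattern with a throwaway self-signed cert (all names below are hypothetical):]

```shell
# Derive the <subject-hash>.0 symlink name for a cert, as the log does.
dir=$(mktemp -d)
openssl req -x509 -newkey rsa:2048 -nodes -subj "/CN=example-ca" \
  -keyout "$dir/ca.key" -out "$dir/ca.pem" -days 1 2>/dev/null
hash=$(openssl x509 -hash -noout -in "$dir/ca.pem")
# Same pattern as the log: link the cert under its subject hash.
ln -fs "$dir/ca.pem" "$dir/$hash.0"
ls -l "$dir/$hash.0"
rm -rf "$dir"
```

OpenSSL's cert-directory lookup expects exactly this `<hash>.N` naming, which is what the `test -L || ln -fs` commands in the log maintain under /etc/ssl/certs.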
	I1124 09:25:48.799503 1707070 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1124 09:25:48.803145 1707070 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I1124 09:25:48.844016 1707070 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I1124 09:25:48.884962 1707070 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I1124 09:25:48.926044 1707070 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I1124 09:25:48.967289 1707070 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I1124 09:25:49.008697 1707070 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
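[editor's note: the six `-checkend 86400` probes above exit 0 only if the certificate will still be valid 86400 seconds (24 h) from now, which is how the restart path decides the existing certs can be reused. A minimal sketch with a throwaway 1-day cert (paths and names are hypothetical):]

```shell
# `openssl x509 -checkend N` exits 0 iff the cert is still valid N seconds
# from now; a 1-day cert passes a 1-hour check and fails a 2-day check.
dir=$(mktemp -d)
openssl req -x509 -newkey rsa:2048 -nodes -subj "/CN=short-lived" \
  -keyout "$dir/k.pem" -out "$dir/c.pem" -days 1 2>/dev/null
openssl x509 -noout -in "$dir/c.pem" -checkend 3600 \
  && echo "still valid an hour from now"
openssl x509 -noout -in "$dir/c.pem" -checkend 172800 \
  || echo "expires within 2 days"
rm -rf "$dir"
```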
	I1124 09:25:49.049934 1707070 kubeadm.go:401] StartCluster: {Name:functional-291288 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-291288 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1124 09:25:49.050012 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I1124 09:25:49.050074 1707070 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1124 09:25:49.080420 1707070 cri.go:89] found id: ""
	I1124 09:25:49.080484 1707070 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1124 09:25:49.088364 1707070 kubeadm.go:417] found existing configuration files, will attempt cluster restart
	I1124 09:25:49.088374 1707070 kubeadm.go:598] restartPrimaryControlPlane start ...
	I1124 09:25:49.088425 1707070 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I1124 09:25:49.095680 1707070 kubeadm.go:131] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I1124 09:25:49.096194 1707070 kubeconfig.go:125] found "functional-291288" server: "https://192.168.49.2:8441"
	I1124 09:25:49.097500 1707070 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I1124 09:25:49.105267 1707070 kubeadm.go:645] detected kubeadm config drift (will reconfigure cluster from new /var/tmp/minikube/kubeadm.yaml):
	-- stdout --
	--- /var/tmp/minikube/kubeadm.yaml	2025-11-24 09:11:10.138797725 +0000
	+++ /var/tmp/minikube/kubeadm.yaml.new	2025-11-24 09:25:47.995648074 +0000
	@@ -24,7 +24,7 @@
	   certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	   extraArgs:
	     - name: "enable-admission-plugins"
	-      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	+      value: "NamespaceAutoProvision"
	 controllerManager:
	   extraArgs:
	     - name: "allocate-node-cidrs"
	
	-- /stdout --
	I1124 09:25:49.105285 1707070 kubeadm.go:1161] stopping kube-system containers ...
	I1124 09:25:49.105296 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name: Namespaces:[kube-system]}
	I1124 09:25:49.105351 1707070 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1124 09:25:49.142256 1707070 cri.go:89] found id: ""
	I1124 09:25:49.142317 1707070 ssh_runner.go:195] Run: sudo systemctl stop kubelet
	I1124 09:25:49.162851 1707070 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1124 09:25:49.170804 1707070 kubeadm.go:158] found existing configuration files:
	-rw------- 1 root root 5635 Nov 24 09:15 /etc/kubernetes/admin.conf
	-rw------- 1 root root 5636 Nov 24 09:15 /etc/kubernetes/controller-manager.conf
	-rw------- 1 root root 5672 Nov 24 09:15 /etc/kubernetes/kubelet.conf
	-rw------- 1 root root 5584 Nov 24 09:15 /etc/kubernetes/scheduler.conf
	
	I1124 09:25:49.170876 1707070 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1124 09:25:49.178603 1707070 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1124 09:25:49.185907 1707070 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1124 09:25:49.185964 1707070 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1124 09:25:49.193453 1707070 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1124 09:25:49.200815 1707070 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1124 09:25:49.200869 1707070 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1124 09:25:49.208328 1707070 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1124 09:25:49.215968 1707070 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1124 09:25:49.216025 1707070 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1124 09:25:49.223400 1707070 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1124 09:25:49.230953 1707070 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase certs all --config /var/tmp/minikube/kubeadm.yaml"
	I1124 09:25:49.277779 1707070 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase kubeconfig all --config /var/tmp/minikube/kubeadm.yaml"
	I1124 09:25:50.308934 1707070 ssh_runner.go:235] Completed: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase kubeconfig all --config /var/tmp/minikube/kubeadm.yaml": (1.031131442s)
	I1124 09:25:50.308993 1707070 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase kubelet-start --config /var/tmp/minikube/kubeadm.yaml"
	I1124 09:25:50.511648 1707070 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase control-plane all --config /var/tmp/minikube/kubeadm.yaml"
	I1124 09:25:50.576653 1707070 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase etcd local --config /var/tmp/minikube/kubeadm.yaml"
	I1124 09:25:50.625775 1707070 api_server.go:52] waiting for apiserver process to appear ...
	I1124 09:25:50.625855 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:51.126713 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:51.625939 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:52.126677 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:52.626053 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:53.126113 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:53.626972 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:54.126493 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:54.626036 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:55.126171 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:55.626853 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:56.126041 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:56.626177 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:57.126019 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:57.626847 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:58.126017 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:58.626716 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:59.125997 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:59.626367 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:00.125951 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:00.626013 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:01.126844 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:01.626038 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:02.126420 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:02.626727 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:03.126582 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:03.626068 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:04.126304 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:04.626830 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:05.126754 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:05.625961 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:06.126197 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:06.626039 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:07.126915 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:07.626052 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:08.126281 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:08.626116 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:09.126574 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:09.626068 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:10.125978 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:10.626328 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:11.126416 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:11.626073 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:12.126027 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:12.626174 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:13.126044 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:13.626781 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:14.126849 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:14.626203 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:15.125957 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:15.626068 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:16.126934 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:16.626382 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:17.126245 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:17.626034 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:18.126745 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:18.626942 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:19.126393 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:19.626607 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:20.126050 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:20.626732 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:21.126049 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:21.626115 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:22.125988 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:22.626261 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:23.126293 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:23.626107 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:24.126971 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:24.626009 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:25.126859 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:25.626876 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:26.126041 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:26.625983 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:27.126168 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:27.626079 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:28.126047 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:28.626761 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:29.126598 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:29.626290 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:30.125941 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:30.626102 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:31.126717 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:31.626588 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:32.126223 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:32.626875 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:33.126051 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:33.625963 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:34.126808 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:34.626621 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:35.126147 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:35.626018 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:36.126039 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:36.625970 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:37.126579 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:37.626198 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:38.126718 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:38.626386 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:39.126159 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:39.626590 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:40.126050 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:40.626422 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:41.126600 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:41.626097 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:42.127732 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:42.626108 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:43.126855 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:43.626202 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:44.126380 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:44.626423 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:45.127019 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:45.626257 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:46.125911 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:46.626125 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:47.126026 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:47.626915 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:48.126322 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:48.626706 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:49.126864 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:49.627009 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:50.126375 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:50.626418 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:26:50.626521 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:26:50.654529 1707070 cri.go:89] found id: ""
	I1124 09:26:50.654543 1707070 logs.go:282] 0 containers: []
	W1124 09:26:50.654550 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:26:50.654555 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:26:50.654624 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:26:50.683038 1707070 cri.go:89] found id: ""
	I1124 09:26:50.683052 1707070 logs.go:282] 0 containers: []
	W1124 09:26:50.683059 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:26:50.683064 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:26:50.683121 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:26:50.711396 1707070 cri.go:89] found id: ""
	I1124 09:26:50.711410 1707070 logs.go:282] 0 containers: []
	W1124 09:26:50.711422 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:26:50.711433 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:26:50.711498 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:26:50.735435 1707070 cri.go:89] found id: ""
	I1124 09:26:50.735449 1707070 logs.go:282] 0 containers: []
	W1124 09:26:50.735457 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:26:50.735463 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:26:50.735520 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:26:50.760437 1707070 cri.go:89] found id: ""
	I1124 09:26:50.760451 1707070 logs.go:282] 0 containers: []
	W1124 09:26:50.760458 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:26:50.760464 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:26:50.760520 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:26:50.785555 1707070 cri.go:89] found id: ""
	I1124 09:26:50.785576 1707070 logs.go:282] 0 containers: []
	W1124 09:26:50.785584 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:26:50.785590 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:26:50.785662 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:26:50.810261 1707070 cri.go:89] found id: ""
	I1124 09:26:50.810278 1707070 logs.go:282] 0 containers: []
	W1124 09:26:50.810286 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:26:50.810294 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:26:50.810305 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:26:50.879322 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:26:50.870488   11382 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:50.871030   11382 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:50.872890   11382 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:50.873352   11382 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:50.875005   11382 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:26:50.870488   11382 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:50.871030   11382 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:50.872890   11382 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:50.873352   11382 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:50.875005   11382 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:26:50.879334 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:26:50.879345 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:26:50.941117 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:26:50.941140 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:26:50.969259 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:26:50.969275 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:26:51.024741 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:26:51.024763 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:26:53.542977 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:53.553083 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:26:53.553155 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:26:53.577781 1707070 cri.go:89] found id: ""
	I1124 09:26:53.577795 1707070 logs.go:282] 0 containers: []
	W1124 09:26:53.577802 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:26:53.577808 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:26:53.577866 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:26:53.604191 1707070 cri.go:89] found id: ""
	I1124 09:26:53.604205 1707070 logs.go:282] 0 containers: []
	W1124 09:26:53.604212 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:26:53.604217 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:26:53.604277 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:26:53.632984 1707070 cri.go:89] found id: ""
	I1124 09:26:53.632998 1707070 logs.go:282] 0 containers: []
	W1124 09:26:53.633004 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:26:53.633010 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:26:53.633071 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:26:53.663828 1707070 cri.go:89] found id: ""
	I1124 09:26:53.663842 1707070 logs.go:282] 0 containers: []
	W1124 09:26:53.663850 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:26:53.663856 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:26:53.663912 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:26:53.695173 1707070 cri.go:89] found id: ""
	I1124 09:26:53.695187 1707070 logs.go:282] 0 containers: []
	W1124 09:26:53.695195 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:26:53.695200 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:26:53.695259 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:26:53.719882 1707070 cri.go:89] found id: ""
	I1124 09:26:53.719897 1707070 logs.go:282] 0 containers: []
	W1124 09:26:53.719904 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:26:53.719910 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:26:53.719993 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:26:53.753006 1707070 cri.go:89] found id: ""
	I1124 09:26:53.753020 1707070 logs.go:282] 0 containers: []
	W1124 09:26:53.753038 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:26:53.753046 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:26:53.753057 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:26:53.810839 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:26:53.810864 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:26:53.828132 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:26:53.828149 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:26:53.893802 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:26:53.885327   11494 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:53.886130   11494 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:53.888016   11494 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:53.888539   11494 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:53.890056   11494 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:26:53.885327   11494 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:53.886130   11494 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:53.888016   11494 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:53.888539   11494 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:53.890056   11494 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:26:53.893815 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:26:53.893825 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:26:53.955840 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:26:53.955860 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:26:56.485625 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:56.495752 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:26:56.495812 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:26:56.523600 1707070 cri.go:89] found id: ""
	I1124 09:26:56.523614 1707070 logs.go:282] 0 containers: []
	W1124 09:26:56.523622 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:26:56.523627 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:26:56.523730 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:26:56.547432 1707070 cri.go:89] found id: ""
	I1124 09:26:56.547445 1707070 logs.go:282] 0 containers: []
	W1124 09:26:56.547453 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:26:56.547465 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:26:56.547522 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:26:56.571895 1707070 cri.go:89] found id: ""
	I1124 09:26:56.571909 1707070 logs.go:282] 0 containers: []
	W1124 09:26:56.571917 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:26:56.571922 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:26:56.571977 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:26:56.596624 1707070 cri.go:89] found id: ""
	I1124 09:26:56.596637 1707070 logs.go:282] 0 containers: []
	W1124 09:26:56.596644 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:26:56.596650 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:26:56.596705 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:26:56.621497 1707070 cri.go:89] found id: ""
	I1124 09:26:56.621511 1707070 logs.go:282] 0 containers: []
	W1124 09:26:56.621518 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:26:56.621523 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:26:56.621588 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:26:56.656808 1707070 cri.go:89] found id: ""
	I1124 09:26:56.656822 1707070 logs.go:282] 0 containers: []
	W1124 09:26:56.656829 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:26:56.656834 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:26:56.656891 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:26:56.693750 1707070 cri.go:89] found id: ""
	I1124 09:26:56.693763 1707070 logs.go:282] 0 containers: []
	W1124 09:26:56.693770 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:26:56.693778 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:26:56.693799 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:26:56.711624 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:26:56.711642 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:26:56.772006 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:26:56.764543   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:56.764946   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:56.766216   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:56.766780   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:56.768382   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:26:56.764543   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:56.764946   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:56.766216   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:56.766780   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:56.768382   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:26:56.772020 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:26:56.772030 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:26:56.832784 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:26:56.832805 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:26:56.862164 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:26:56.862179 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:26:59.417328 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:59.427445 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:26:59.427506 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:26:59.451539 1707070 cri.go:89] found id: ""
	I1124 09:26:59.451574 1707070 logs.go:282] 0 containers: []
	W1124 09:26:59.451582 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:26:59.451588 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:26:59.451647 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:26:59.476110 1707070 cri.go:89] found id: ""
	I1124 09:26:59.476124 1707070 logs.go:282] 0 containers: []
	W1124 09:26:59.476131 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:26:59.476137 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:26:59.476194 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:26:59.504520 1707070 cri.go:89] found id: ""
	I1124 09:26:59.504533 1707070 logs.go:282] 0 containers: []
	W1124 09:26:59.504540 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:26:59.504546 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:26:59.504607 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:26:59.529647 1707070 cri.go:89] found id: ""
	I1124 09:26:59.529662 1707070 logs.go:282] 0 containers: []
	W1124 09:26:59.529669 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:26:59.529674 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:26:59.529753 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:26:59.558904 1707070 cri.go:89] found id: ""
	I1124 09:26:59.558918 1707070 logs.go:282] 0 containers: []
	W1124 09:26:59.558925 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:26:59.558930 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:26:59.558999 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:26:59.583698 1707070 cri.go:89] found id: ""
	I1124 09:26:59.583712 1707070 logs.go:282] 0 containers: []
	W1124 09:26:59.583733 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:26:59.583738 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:26:59.583800 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:26:59.607605 1707070 cri.go:89] found id: ""
	I1124 09:26:59.607619 1707070 logs.go:282] 0 containers: []
	W1124 09:26:59.607626 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:26:59.607634 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:26:59.607645 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:26:59.624446 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:26:59.624462 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:26:59.711588 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:26:59.701837   11699 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:59.703242   11699 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:59.704228   11699 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:59.706009   11699 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:59.706513   11699 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:26:59.701837   11699 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:59.703242   11699 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:59.704228   11699 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:59.706009   11699 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:59.706513   11699 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:26:59.711600 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:26:59.711610 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:26:59.777617 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:26:59.777638 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:26:59.810868 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:26:59.810888 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:02.368395 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:02.379444 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:02.379503 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:02.403995 1707070 cri.go:89] found id: ""
	I1124 09:27:02.404009 1707070 logs.go:282] 0 containers: []
	W1124 09:27:02.404017 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:02.404022 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:02.404080 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:02.428532 1707070 cri.go:89] found id: ""
	I1124 09:27:02.428546 1707070 logs.go:282] 0 containers: []
	W1124 09:27:02.428553 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:02.428559 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:02.428623 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:02.455148 1707070 cri.go:89] found id: ""
	I1124 09:27:02.455162 1707070 logs.go:282] 0 containers: []
	W1124 09:27:02.455169 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:02.455174 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:02.455233 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:02.479942 1707070 cri.go:89] found id: ""
	I1124 09:27:02.479957 1707070 logs.go:282] 0 containers: []
	W1124 09:27:02.479969 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:02.479975 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:02.480034 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:02.505728 1707070 cri.go:89] found id: ""
	I1124 09:27:02.505744 1707070 logs.go:282] 0 containers: []
	W1124 09:27:02.505751 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:02.505760 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:02.505845 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:02.536863 1707070 cri.go:89] found id: ""
	I1124 09:27:02.536881 1707070 logs.go:282] 0 containers: []
	W1124 09:27:02.536889 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:02.536894 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:02.536960 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:02.566083 1707070 cri.go:89] found id: ""
	I1124 09:27:02.566107 1707070 logs.go:282] 0 containers: []
	W1124 09:27:02.566124 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:02.566132 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:02.566142 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:02.628402 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:02.628423 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:02.669505 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:02.669523 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:02.737879 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:02.737907 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:02.755317 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:02.755334 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:02.820465 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:02.811248   11818 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:02.812608   11818 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:02.813513   11818 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:02.815318   11818 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:02.815727   11818 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:02.811248   11818 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:02.812608   11818 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:02.813513   11818 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:02.815318   11818 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:02.815727   11818 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:05.320749 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:05.331020 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:05.331081 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:05.355889 1707070 cri.go:89] found id: ""
	I1124 09:27:05.355904 1707070 logs.go:282] 0 containers: []
	W1124 09:27:05.355912 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:05.355917 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:05.355980 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:05.381650 1707070 cri.go:89] found id: ""
	I1124 09:27:05.381664 1707070 logs.go:282] 0 containers: []
	W1124 09:27:05.381671 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:05.381676 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:05.381733 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:05.410311 1707070 cri.go:89] found id: ""
	I1124 09:27:05.410325 1707070 logs.go:282] 0 containers: []
	W1124 09:27:05.410332 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:05.410337 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:05.410396 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:05.434601 1707070 cri.go:89] found id: ""
	I1124 09:27:05.434615 1707070 logs.go:282] 0 containers: []
	W1124 09:27:05.434621 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:05.434627 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:05.434684 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:05.459196 1707070 cri.go:89] found id: ""
	I1124 09:27:05.459210 1707070 logs.go:282] 0 containers: []
	W1124 09:27:05.459218 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:05.459223 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:05.459294 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:05.483433 1707070 cri.go:89] found id: ""
	I1124 09:27:05.483448 1707070 logs.go:282] 0 containers: []
	W1124 09:27:05.483455 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:05.483460 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:05.483523 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:05.508072 1707070 cri.go:89] found id: ""
	I1124 09:27:05.508086 1707070 logs.go:282] 0 containers: []
	W1124 09:27:05.508093 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:05.508101 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:05.508111 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:05.563733 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:05.563752 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:05.584705 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:05.584736 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:05.666380 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:05.657873   11904 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:05.658740   11904 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:05.660432   11904 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:05.660828   11904 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:05.662363   11904 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:05.657873   11904 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:05.658740   11904 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:05.660432   11904 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:05.660828   11904 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:05.662363   11904 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:05.666394 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:05.666405 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:05.738526 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:05.738548 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:08.268404 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:08.278347 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:08.278408 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:08.303562 1707070 cri.go:89] found id: ""
	I1124 09:27:08.303577 1707070 logs.go:282] 0 containers: []
	W1124 09:27:08.303585 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:08.303590 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:08.303651 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:08.329886 1707070 cri.go:89] found id: ""
	I1124 09:27:08.329900 1707070 logs.go:282] 0 containers: []
	W1124 09:27:08.329907 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:08.329913 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:08.329971 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:08.355081 1707070 cri.go:89] found id: ""
	I1124 09:27:08.355096 1707070 logs.go:282] 0 containers: []
	W1124 09:27:08.355104 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:08.355110 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:08.355175 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:08.381511 1707070 cri.go:89] found id: ""
	I1124 09:27:08.381534 1707070 logs.go:282] 0 containers: []
	W1124 09:27:08.381543 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:08.381549 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:08.381620 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:08.410606 1707070 cri.go:89] found id: ""
	I1124 09:27:08.410629 1707070 logs.go:282] 0 containers: []
	W1124 09:27:08.410637 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:08.410642 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:08.410700 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:08.434980 1707070 cri.go:89] found id: ""
	I1124 09:27:08.434994 1707070 logs.go:282] 0 containers: []
	W1124 09:27:08.435001 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:08.435007 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:08.435064 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:08.463780 1707070 cri.go:89] found id: ""
	I1124 09:27:08.463793 1707070 logs.go:282] 0 containers: []
	W1124 09:27:08.463800 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:08.463808 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:08.463819 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:08.527201 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:08.518614   12003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:08.519320   12003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:08.521220   12003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:08.521832   12003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:08.523649   12003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:08.518614   12003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:08.519320   12003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:08.521220   12003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:08.521832   12003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:08.523649   12003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:08.527213 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:08.527223 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:08.591559 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:08.591581 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:08.619107 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:08.619125 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:08.678658 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:08.678675 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:11.199028 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:11.209463 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:11.209529 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:11.236040 1707070 cri.go:89] found id: ""
	I1124 09:27:11.236061 1707070 logs.go:282] 0 containers: []
	W1124 09:27:11.236069 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:11.236075 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:11.236145 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:11.263895 1707070 cri.go:89] found id: ""
	I1124 09:27:11.263906 1707070 logs.go:282] 0 containers: []
	W1124 09:27:11.263912 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:11.263917 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:11.263968 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:11.290492 1707070 cri.go:89] found id: ""
	I1124 09:27:11.290507 1707070 logs.go:282] 0 containers: []
	W1124 09:27:11.290514 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:11.290519 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:11.290575 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:11.316763 1707070 cri.go:89] found id: ""
	I1124 09:27:11.316778 1707070 logs.go:282] 0 containers: []
	W1124 09:27:11.316785 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:11.316791 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:11.316899 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:11.340653 1707070 cri.go:89] found id: ""
	I1124 09:27:11.340668 1707070 logs.go:282] 0 containers: []
	W1124 09:27:11.340675 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:11.340680 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:11.340741 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:11.365000 1707070 cri.go:89] found id: ""
	I1124 09:27:11.365013 1707070 logs.go:282] 0 containers: []
	W1124 09:27:11.365020 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:11.365026 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:11.365086 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:11.393012 1707070 cri.go:89] found id: ""
	I1124 09:27:11.393025 1707070 logs.go:282] 0 containers: []
	W1124 09:27:11.393033 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:11.393041 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:11.393053 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:11.409740 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:11.409758 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:11.474068 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:11.465242   12112 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:11.466095   12112 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:11.467959   12112 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:11.468588   12112 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:11.470448   12112 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:11.465242   12112 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:11.466095   12112 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:11.467959   12112 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:11.468588   12112 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:11.470448   12112 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:11.474079 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:11.474089 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:11.535411 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:11.535433 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:11.565626 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:11.565645 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:14.123823 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:14.133770 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:14.133829 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:14.157476 1707070 cri.go:89] found id: ""
	I1124 09:27:14.157490 1707070 logs.go:282] 0 containers: []
	W1124 09:27:14.157497 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:14.157503 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:14.157562 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:14.188747 1707070 cri.go:89] found id: ""
	I1124 09:27:14.188761 1707070 logs.go:282] 0 containers: []
	W1124 09:27:14.188768 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:14.188773 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:14.188830 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:14.216257 1707070 cri.go:89] found id: ""
	I1124 09:27:14.216271 1707070 logs.go:282] 0 containers: []
	W1124 09:27:14.216279 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:14.216284 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:14.216345 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:14.241336 1707070 cri.go:89] found id: ""
	I1124 09:27:14.241349 1707070 logs.go:282] 0 containers: []
	W1124 09:27:14.241357 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:14.241362 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:14.241423 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:14.265223 1707070 cri.go:89] found id: ""
	I1124 09:27:14.265238 1707070 logs.go:282] 0 containers: []
	W1124 09:27:14.265245 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:14.265250 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:14.265312 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:14.292087 1707070 cri.go:89] found id: ""
	I1124 09:27:14.292101 1707070 logs.go:282] 0 containers: []
	W1124 09:27:14.292108 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:14.292114 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:14.292171 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:14.316839 1707070 cri.go:89] found id: ""
	I1124 09:27:14.316854 1707070 logs.go:282] 0 containers: []
	W1124 09:27:14.316861 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:14.316869 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:14.316879 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:14.371692 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:14.371715 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:14.388964 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:14.388980 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:14.455069 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:14.447375   12220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:14.448018   12220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:14.449517   12220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:14.449819   12220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:14.451683   12220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:14.447375   12220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:14.448018   12220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:14.449517   12220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:14.449819   12220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:14.451683   12220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:14.455080 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:14.455090 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:14.518102 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:14.518124 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:17.045537 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:17.055937 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:17.056004 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:17.084357 1707070 cri.go:89] found id: ""
	I1124 09:27:17.084370 1707070 logs.go:282] 0 containers: []
	W1124 09:27:17.084378 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:17.084383 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:17.084439 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:17.112022 1707070 cri.go:89] found id: ""
	I1124 09:27:17.112035 1707070 logs.go:282] 0 containers: []
	W1124 09:27:17.112043 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:17.112048 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:17.112110 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:17.135317 1707070 cri.go:89] found id: ""
	I1124 09:27:17.135331 1707070 logs.go:282] 0 containers: []
	W1124 09:27:17.135338 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:17.135343 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:17.135399 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:17.163850 1707070 cri.go:89] found id: ""
	I1124 09:27:17.163865 1707070 logs.go:282] 0 containers: []
	W1124 09:27:17.163872 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:17.163878 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:17.163933 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:17.188915 1707070 cri.go:89] found id: ""
	I1124 09:27:17.188929 1707070 logs.go:282] 0 containers: []
	W1124 09:27:17.188936 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:17.188941 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:17.188997 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:17.217448 1707070 cri.go:89] found id: ""
	I1124 09:27:17.217461 1707070 logs.go:282] 0 containers: []
	W1124 09:27:17.217475 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:17.217480 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:17.217537 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:17.242521 1707070 cri.go:89] found id: ""
	I1124 09:27:17.242536 1707070 logs.go:282] 0 containers: []
	W1124 09:27:17.242543 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:17.242551 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:17.242561 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:17.297899 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:17.297921 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:17.315278 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:17.315297 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:17.377620 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:17.368489   12324 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:17.368893   12324 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:17.370596   12324 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:17.371050   12324 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:17.372486   12324 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:17.368489   12324 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:17.368893   12324 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:17.370596   12324 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:17.371050   12324 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:17.372486   12324 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:17.377640 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:17.377651 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:17.439884 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:17.439907 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:19.969337 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:19.979536 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:19.979595 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:20.018198 1707070 cri.go:89] found id: ""
	I1124 09:27:20.018220 1707070 logs.go:282] 0 containers: []
	W1124 09:27:20.018229 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:20.018235 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:20.018297 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:20.046055 1707070 cri.go:89] found id: ""
	I1124 09:27:20.046070 1707070 logs.go:282] 0 containers: []
	W1124 09:27:20.046077 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:20.046082 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:20.046158 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:20.078159 1707070 cri.go:89] found id: ""
	I1124 09:27:20.078183 1707070 logs.go:282] 0 containers: []
	W1124 09:27:20.078191 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:20.078197 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:20.078289 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:20.104136 1707070 cri.go:89] found id: ""
	I1124 09:27:20.104151 1707070 logs.go:282] 0 containers: []
	W1124 09:27:20.104158 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:20.104164 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:20.104228 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:20.130266 1707070 cri.go:89] found id: ""
	I1124 09:27:20.130280 1707070 logs.go:282] 0 containers: []
	W1124 09:27:20.130288 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:20.130293 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:20.130352 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:20.156899 1707070 cri.go:89] found id: ""
	I1124 09:27:20.156913 1707070 logs.go:282] 0 containers: []
	W1124 09:27:20.156921 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:20.156926 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:20.156986 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:20.182706 1707070 cri.go:89] found id: ""
	I1124 09:27:20.182721 1707070 logs.go:282] 0 containers: []
	W1124 09:27:20.182728 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:20.182736 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:20.182747 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:20.240720 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:20.240740 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:20.257971 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:20.257987 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:20.324806 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:20.316231   12432 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:20.316929   12432 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:20.317881   12432 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:20.319464   12432 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:20.319918   12432 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:20.316231   12432 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:20.316929   12432 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:20.317881   12432 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:20.319464   12432 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:20.319918   12432 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:20.324827 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:20.324838 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:20.386188 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:20.386212 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:22.915679 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:22.927190 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:22.927254 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:22.959235 1707070 cri.go:89] found id: ""
	I1124 09:27:22.959249 1707070 logs.go:282] 0 containers: []
	W1124 09:27:22.959256 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:22.959262 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:22.959318 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:22.986124 1707070 cri.go:89] found id: ""
	I1124 09:27:22.986138 1707070 logs.go:282] 0 containers: []
	W1124 09:27:22.986146 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:22.986151 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:22.986206 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:23.016094 1707070 cri.go:89] found id: ""
	I1124 09:27:23.016108 1707070 logs.go:282] 0 containers: []
	W1124 09:27:23.016116 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:23.016121 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:23.016183 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:23.044417 1707070 cri.go:89] found id: ""
	I1124 09:27:23.044431 1707070 logs.go:282] 0 containers: []
	W1124 09:27:23.044439 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:23.044444 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:23.044501 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:23.069468 1707070 cri.go:89] found id: ""
	I1124 09:27:23.069484 1707070 logs.go:282] 0 containers: []
	W1124 09:27:23.069491 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:23.069497 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:23.069556 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:23.096521 1707070 cri.go:89] found id: ""
	I1124 09:27:23.096535 1707070 logs.go:282] 0 containers: []
	W1124 09:27:23.096542 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:23.096548 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:23.096605 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:23.125327 1707070 cri.go:89] found id: ""
	I1124 09:27:23.125342 1707070 logs.go:282] 0 containers: []
	W1124 09:27:23.125349 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:23.125358 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:23.125367 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:23.180584 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:23.180605 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:23.197372 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:23.197388 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:23.259943 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:23.251679   12538 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:23.252410   12538 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:23.253306   12538 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:23.254866   12538 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:23.255334   12538 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:23.251679   12538 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:23.252410   12538 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:23.253306   12538 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:23.254866   12538 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:23.255334   12538 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:23.259953 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:23.259965 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:23.325045 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:23.325066 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:25.855733 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:25.866329 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:25.866395 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:25.906494 1707070 cri.go:89] found id: ""
	I1124 09:27:25.906508 1707070 logs.go:282] 0 containers: []
	W1124 09:27:25.906516 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:25.906521 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:25.906590 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:25.945205 1707070 cri.go:89] found id: ""
	I1124 09:27:25.945229 1707070 logs.go:282] 0 containers: []
	W1124 09:27:25.945237 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:25.945242 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:25.945301 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:25.970721 1707070 cri.go:89] found id: ""
	I1124 09:27:25.970736 1707070 logs.go:282] 0 containers: []
	W1124 09:27:25.970743 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:25.970749 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:25.970807 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:25.997334 1707070 cri.go:89] found id: ""
	I1124 09:27:25.997348 1707070 logs.go:282] 0 containers: []
	W1124 09:27:25.997355 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:25.997364 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:25.997438 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:26.029916 1707070 cri.go:89] found id: ""
	I1124 09:27:26.029932 1707070 logs.go:282] 0 containers: []
	W1124 09:27:26.029940 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:26.029945 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:26.030007 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:26.057466 1707070 cri.go:89] found id: ""
	I1124 09:27:26.057480 1707070 logs.go:282] 0 containers: []
	W1124 09:27:26.057488 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:26.057494 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:26.057565 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:26.083489 1707070 cri.go:89] found id: ""
	I1124 09:27:26.083503 1707070 logs.go:282] 0 containers: []
	W1124 09:27:26.083511 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:26.083519 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:26.083529 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:26.140569 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:26.140588 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:26.158554 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:26.158571 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:26.230573 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:26.222615   12645 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:26.223218   12645 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:26.224819   12645 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:26.225472   12645 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:26.226976   12645 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1124 09:27:26.230583 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:26.230594 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:26.292417 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:26.292436 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:28.819944 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:28.830528 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:28.830587 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:28.854228 1707070 cri.go:89] found id: ""
	I1124 09:27:28.854243 1707070 logs.go:282] 0 containers: []
	W1124 09:27:28.854250 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:28.854260 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:28.854324 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:28.891203 1707070 cri.go:89] found id: ""
	I1124 09:27:28.891217 1707070 logs.go:282] 0 containers: []
	W1124 09:27:28.891224 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:28.891230 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:28.891305 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:28.918573 1707070 cri.go:89] found id: ""
	I1124 09:27:28.918587 1707070 logs.go:282] 0 containers: []
	W1124 09:27:28.918594 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:28.918600 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:28.918665 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:28.944672 1707070 cri.go:89] found id: ""
	I1124 09:27:28.944685 1707070 logs.go:282] 0 containers: []
	W1124 09:27:28.944692 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:28.944708 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:28.944763 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:28.970414 1707070 cri.go:89] found id: ""
	I1124 09:27:28.970429 1707070 logs.go:282] 0 containers: []
	W1124 09:27:28.970436 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:28.970441 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:28.970539 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:28.995438 1707070 cri.go:89] found id: ""
	I1124 09:27:28.995453 1707070 logs.go:282] 0 containers: []
	W1124 09:27:28.995460 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:28.995466 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:28.995526 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:29.023817 1707070 cri.go:89] found id: ""
	I1124 09:27:29.023832 1707070 logs.go:282] 0 containers: []
	W1124 09:27:29.023839 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:29.023847 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:29.023858 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:29.080316 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:29.080336 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:29.097486 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:29.097502 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:29.159875 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:29.151793   12751 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:29.152163   12751 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:29.153608   12751 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:29.154019   12751 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:29.155829   12751 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1124 09:27:29.159888 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:29.159907 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:29.223729 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:29.223754 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:31.751641 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:31.761798 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:31.761859 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:31.788691 1707070 cri.go:89] found id: ""
	I1124 09:27:31.788705 1707070 logs.go:282] 0 containers: []
	W1124 09:27:31.788711 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:31.788717 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:31.788776 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:31.812359 1707070 cri.go:89] found id: ""
	I1124 09:27:31.812374 1707070 logs.go:282] 0 containers: []
	W1124 09:27:31.812382 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:31.812387 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:31.812450 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:31.837276 1707070 cri.go:89] found id: ""
	I1124 09:27:31.837289 1707070 logs.go:282] 0 containers: []
	W1124 09:27:31.837296 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:31.837302 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:31.837360 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:31.862818 1707070 cri.go:89] found id: ""
	I1124 09:27:31.862832 1707070 logs.go:282] 0 containers: []
	W1124 09:27:31.862840 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:31.862846 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:31.862903 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:31.904922 1707070 cri.go:89] found id: ""
	I1124 09:27:31.904936 1707070 logs.go:282] 0 containers: []
	W1124 09:27:31.904944 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:31.904950 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:31.905012 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:31.949580 1707070 cri.go:89] found id: ""
	I1124 09:27:31.949594 1707070 logs.go:282] 0 containers: []
	W1124 09:27:31.949601 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:31.949607 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:31.949661 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:31.975157 1707070 cri.go:89] found id: ""
	I1124 09:27:31.975171 1707070 logs.go:282] 0 containers: []
	W1124 09:27:31.975178 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:31.975187 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:31.975198 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:32.004216 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:32.004239 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:32.064444 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:32.064466 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:32.084210 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:32.084229 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:32.152949 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:32.144237   12868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:32.145124   12868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:32.147159   12868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:32.147890   12868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:32.148900   12868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1124 09:27:32.152963 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:32.152975 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:34.714493 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:34.725033 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:34.725101 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:34.750339 1707070 cri.go:89] found id: ""
	I1124 09:27:34.750352 1707070 logs.go:282] 0 containers: []
	W1124 09:27:34.750359 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:34.750365 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:34.750422 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:34.776574 1707070 cri.go:89] found id: ""
	I1124 09:27:34.776588 1707070 logs.go:282] 0 containers: []
	W1124 09:27:34.776595 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:34.776600 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:34.776656 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:34.801274 1707070 cri.go:89] found id: ""
	I1124 09:27:34.801288 1707070 logs.go:282] 0 containers: []
	W1124 09:27:34.801295 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:34.801300 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:34.801355 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:34.828204 1707070 cri.go:89] found id: ""
	I1124 09:27:34.828217 1707070 logs.go:282] 0 containers: []
	W1124 09:27:34.828224 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:34.828230 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:34.828286 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:34.856488 1707070 cri.go:89] found id: ""
	I1124 09:27:34.856502 1707070 logs.go:282] 0 containers: []
	W1124 09:27:34.856509 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:34.856514 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:34.856571 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:34.882889 1707070 cri.go:89] found id: ""
	I1124 09:27:34.882903 1707070 logs.go:282] 0 containers: []
	W1124 09:27:34.882914 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:34.882919 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:34.882988 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:34.914562 1707070 cri.go:89] found id: ""
	I1124 09:27:34.914576 1707070 logs.go:282] 0 containers: []
	W1124 09:27:34.914583 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:34.914591 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:34.914601 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:34.981562 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:34.981596 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:34.998925 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:34.998941 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:35.070877 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:35.062206   12962 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:35.063028   12962 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:35.064710   12962 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:35.065308   12962 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:35.067060   12962 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1124 09:27:35.070899 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:35.070909 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:35.137172 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:35.137193 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:37.666865 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:37.677121 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:37.677182 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:37.702376 1707070 cri.go:89] found id: ""
	I1124 09:27:37.702390 1707070 logs.go:282] 0 containers: []
	W1124 09:27:37.702398 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:37.702407 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:37.702491 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:37.727342 1707070 cri.go:89] found id: ""
	I1124 09:27:37.727355 1707070 logs.go:282] 0 containers: []
	W1124 09:27:37.727363 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:37.727368 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:37.727430 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:37.753323 1707070 cri.go:89] found id: ""
	I1124 09:27:37.753336 1707070 logs.go:282] 0 containers: []
	W1124 09:27:37.753343 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:37.753349 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:37.753409 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:37.781020 1707070 cri.go:89] found id: ""
	I1124 09:27:37.781041 1707070 logs.go:282] 0 containers: []
	W1124 09:27:37.781049 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:37.781055 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:37.781117 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:37.805925 1707070 cri.go:89] found id: ""
	I1124 09:27:37.805939 1707070 logs.go:282] 0 containers: []
	W1124 09:27:37.805946 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:37.805952 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:37.806013 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:37.833036 1707070 cri.go:89] found id: ""
	I1124 09:27:37.833062 1707070 logs.go:282] 0 containers: []
	W1124 09:27:37.833069 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:37.833075 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:37.833140 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:37.860115 1707070 cri.go:89] found id: ""
	I1124 09:27:37.860129 1707070 logs.go:282] 0 containers: []
	W1124 09:27:37.860137 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:37.860145 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:37.860156 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:37.926098 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:37.926118 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:37.960030 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:37.960045 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:38.019375 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:38.019395 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:38.039066 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:38.039085 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:38.110062 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:38.101570   13079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:38.102692   13079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:38.104495   13079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:38.105053   13079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:38.106366   13079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1124 09:27:40.610482 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:40.620402 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:40.620472 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:40.648289 1707070 cri.go:89] found id: ""
	I1124 09:27:40.648303 1707070 logs.go:282] 0 containers: []
	W1124 09:27:40.648311 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:40.648317 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:40.648373 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:40.672588 1707070 cri.go:89] found id: ""
	I1124 09:27:40.672603 1707070 logs.go:282] 0 containers: []
	W1124 09:27:40.672610 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:40.672616 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:40.672673 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:40.700039 1707070 cri.go:89] found id: ""
	I1124 09:27:40.700053 1707070 logs.go:282] 0 containers: []
	W1124 09:27:40.700060 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:40.700066 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:40.700129 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:40.728494 1707070 cri.go:89] found id: ""
	I1124 09:27:40.728508 1707070 logs.go:282] 0 containers: []
	W1124 09:27:40.728516 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:40.728522 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:40.728582 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:40.753773 1707070 cri.go:89] found id: ""
	I1124 09:27:40.753786 1707070 logs.go:282] 0 containers: []
	W1124 09:27:40.753793 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:40.753798 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:40.753860 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:40.778243 1707070 cri.go:89] found id: ""
	I1124 09:27:40.778257 1707070 logs.go:282] 0 containers: []
	W1124 09:27:40.778264 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:40.778270 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:40.778333 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:40.804316 1707070 cri.go:89] found id: ""
	I1124 09:27:40.804329 1707070 logs.go:282] 0 containers: []
	W1124 09:27:40.804350 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:40.804358 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:40.804370 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:40.821314 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:40.821330 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:40.901213 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:40.878028   13164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:40.878824   13164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:40.894654   13164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:40.895170   13164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:40.896920   13164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:40.878028   13164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:40.878824   13164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:40.894654   13164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:40.895170   13164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:40.896920   13164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:40.901232 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:40.901242 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:40.972785 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:40.972806 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:41.000947 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:41.000967 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:43.560416 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:43.570821 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:43.570882 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:43.595557 1707070 cri.go:89] found id: ""
	I1124 09:27:43.595571 1707070 logs.go:282] 0 containers: []
	W1124 09:27:43.595579 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:43.595585 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:43.595640 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:43.623980 1707070 cri.go:89] found id: ""
	I1124 09:27:43.623996 1707070 logs.go:282] 0 containers: []
	W1124 09:27:43.624003 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:43.624008 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:43.624074 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:43.649674 1707070 cri.go:89] found id: ""
	I1124 09:27:43.649688 1707070 logs.go:282] 0 containers: []
	W1124 09:27:43.649695 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:43.649701 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:43.649758 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:43.673375 1707070 cri.go:89] found id: ""
	I1124 09:27:43.673388 1707070 logs.go:282] 0 containers: []
	W1124 09:27:43.673397 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:43.673403 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:43.673459 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:43.700917 1707070 cri.go:89] found id: ""
	I1124 09:27:43.700931 1707070 logs.go:282] 0 containers: []
	W1124 09:27:43.700938 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:43.700943 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:43.701000 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:43.725453 1707070 cri.go:89] found id: ""
	I1124 09:27:43.725467 1707070 logs.go:282] 0 containers: []
	W1124 09:27:43.725481 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:43.725487 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:43.725557 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:43.755304 1707070 cri.go:89] found id: ""
	I1124 09:27:43.755318 1707070 logs.go:282] 0 containers: []
	W1124 09:27:43.755326 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:43.755335 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:43.755346 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:43.772549 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:43.772567 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:43.837565 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:43.829378   13269 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:43.829969   13269 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:43.831587   13269 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:43.832265   13269 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:43.833938   13269 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:43.829378   13269 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:43.829969   13269 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:43.831587   13269 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:43.832265   13269 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:43.833938   13269 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:43.837575 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:43.837587 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:43.898949 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:43.898969 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:43.934259 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:43.934277 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:46.497111 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:46.507177 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:46.507251 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:46.531012 1707070 cri.go:89] found id: ""
	I1124 09:27:46.531025 1707070 logs.go:282] 0 containers: []
	W1124 09:27:46.531032 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:46.531038 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:46.531101 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:46.555781 1707070 cri.go:89] found id: ""
	I1124 09:27:46.555795 1707070 logs.go:282] 0 containers: []
	W1124 09:27:46.555802 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:46.555807 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:46.555864 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:46.580956 1707070 cri.go:89] found id: ""
	I1124 09:27:46.580974 1707070 logs.go:282] 0 containers: []
	W1124 09:27:46.580982 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:46.580987 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:46.581055 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:46.606320 1707070 cri.go:89] found id: ""
	I1124 09:27:46.606333 1707070 logs.go:282] 0 containers: []
	W1124 09:27:46.606340 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:46.606346 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:46.606414 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:46.632671 1707070 cri.go:89] found id: ""
	I1124 09:27:46.632685 1707070 logs.go:282] 0 containers: []
	W1124 09:27:46.632692 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:46.632697 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:46.632755 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:46.656948 1707070 cri.go:89] found id: ""
	I1124 09:27:46.656962 1707070 logs.go:282] 0 containers: []
	W1124 09:27:46.656969 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:46.656975 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:46.657037 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:46.681897 1707070 cri.go:89] found id: ""
	I1124 09:27:46.681910 1707070 logs.go:282] 0 containers: []
	W1124 09:27:46.681917 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:46.681925 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:46.681936 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:46.698822 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:46.698839 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:46.763473 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:46.755294   13373 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:46.755864   13373 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:46.757448   13373 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:46.758065   13373 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:46.759847   13373 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:46.755294   13373 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:46.755864   13373 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:46.757448   13373 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:46.758065   13373 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:46.759847   13373 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:46.763499 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:46.763510 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:46.826271 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:46.826293 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:46.855001 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:46.855017 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:49.412865 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:49.423511 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:49.423574 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:49.447618 1707070 cri.go:89] found id: ""
	I1124 09:27:49.447632 1707070 logs.go:282] 0 containers: []
	W1124 09:27:49.447639 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:49.447645 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:49.447705 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:49.476127 1707070 cri.go:89] found id: ""
	I1124 09:27:49.476140 1707070 logs.go:282] 0 containers: []
	W1124 09:27:49.476147 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:49.476154 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:49.476213 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:49.501684 1707070 cri.go:89] found id: ""
	I1124 09:27:49.501697 1707070 logs.go:282] 0 containers: []
	W1124 09:27:49.501705 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:49.501711 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:49.501771 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:49.527011 1707070 cri.go:89] found id: ""
	I1124 09:27:49.527025 1707070 logs.go:282] 0 containers: []
	W1124 09:27:49.527033 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:49.527038 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:49.527098 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:49.552026 1707070 cri.go:89] found id: ""
	I1124 09:27:49.552040 1707070 logs.go:282] 0 containers: []
	W1124 09:27:49.552047 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:49.552053 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:49.552110 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:49.582162 1707070 cri.go:89] found id: ""
	I1124 09:27:49.582189 1707070 logs.go:282] 0 containers: []
	W1124 09:27:49.582196 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:49.582202 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:49.582275 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:49.612653 1707070 cri.go:89] found id: ""
	I1124 09:27:49.612667 1707070 logs.go:282] 0 containers: []
	W1124 09:27:49.612675 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:49.612683 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:49.612693 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:49.668483 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:49.668504 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:49.685463 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:49.685480 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:49.750076 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:49.741868   13477 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:49.742309   13477 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:49.744083   13477 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:49.744608   13477 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:49.746375   13477 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:49.741868   13477 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:49.742309   13477 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:49.744083   13477 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:49.744608   13477 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:49.746375   13477 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:49.750136 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:49.750148 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:49.811614 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:49.811634 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
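Aside: the repeated "connection refused" cycles above are a wait loop — minikube keeps probing localhost:8441 until the apiserver answers or the deadline passes. A minimal Python sketch of that polling pattern (illustrative only, not minikube's actual implementation; the host, port, and timing parameters here are assumptions):

```python
# Illustrative wait-for-port loop, mirroring the apiserver polling seen in
# the log: retry a TCP connect until it succeeds or a deadline expires.
import socket
import time


def wait_for_port(host: str, port: int, timeout_s: float = 10.0,
                  interval_s: float = 0.5) -> bool:
    """Return True once host:port accepts a TCP connection, False on deadline."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        try:
            with socket.create_connection((host, port), timeout=1.0):
                return True  # listener (e.g. an apiserver) is reachable
        except OSError:
            time.sleep(interval_s)  # connection refused: back off and retry
    return False
```

In the failing runs above this loop never succeeds, because no kube-apiserver container exists for containerd to run (`crictl ps` finds no matching container), so every connect attempt on [::1]:8441 is refused.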
	I1124 09:27:52.341239 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:52.351722 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:52.351784 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:52.378388 1707070 cri.go:89] found id: ""
	I1124 09:27:52.378402 1707070 logs.go:282] 0 containers: []
	W1124 09:27:52.378410 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:52.378416 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:52.378498 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:52.404052 1707070 cri.go:89] found id: ""
	I1124 09:27:52.404067 1707070 logs.go:282] 0 containers: []
	W1124 09:27:52.404074 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:52.404079 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:52.404138 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:52.428854 1707070 cri.go:89] found id: ""
	I1124 09:27:52.428868 1707070 logs.go:282] 0 containers: []
	W1124 09:27:52.428876 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:52.428882 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:52.428945 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:52.460795 1707070 cri.go:89] found id: ""
	I1124 09:27:52.460808 1707070 logs.go:282] 0 containers: []
	W1124 09:27:52.460815 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:52.460825 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:52.460886 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:52.490351 1707070 cri.go:89] found id: ""
	I1124 09:27:52.490365 1707070 logs.go:282] 0 containers: []
	W1124 09:27:52.490372 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:52.490378 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:52.490438 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:52.515789 1707070 cri.go:89] found id: ""
	I1124 09:27:52.515804 1707070 logs.go:282] 0 containers: []
	W1124 09:27:52.515811 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:52.515816 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:52.515874 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:52.544304 1707070 cri.go:89] found id: ""
	I1124 09:27:52.544318 1707070 logs.go:282] 0 containers: []
	W1124 09:27:52.544326 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:52.544335 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:52.544347 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:52.611718 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:52.603411   13577 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:52.604016   13577 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:52.605628   13577 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:52.606175   13577 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:52.607864   13577 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:52.603411   13577 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:52.604016   13577 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:52.605628   13577 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:52.606175   13577 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:52.607864   13577 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:52.611731 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:52.611743 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:52.679720 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:52.679740 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:52.708422 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:52.708437 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:52.766414 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:52.766433 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:55.285861 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:55.296023 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:55.296086 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:55.324396 1707070 cri.go:89] found id: ""
	I1124 09:27:55.324409 1707070 logs.go:282] 0 containers: []
	W1124 09:27:55.324417 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:55.324422 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:55.324478 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:55.348746 1707070 cri.go:89] found id: ""
	I1124 09:27:55.348760 1707070 logs.go:282] 0 containers: []
	W1124 09:27:55.348767 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:55.348773 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:55.348832 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:55.373685 1707070 cri.go:89] found id: ""
	I1124 09:27:55.373710 1707070 logs.go:282] 0 containers: []
	W1124 09:27:55.373718 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:55.373724 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:55.373780 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:55.399757 1707070 cri.go:89] found id: ""
	I1124 09:27:55.399774 1707070 logs.go:282] 0 containers: []
	W1124 09:27:55.399783 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:55.399789 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:55.399848 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:55.424773 1707070 cri.go:89] found id: ""
	I1124 09:27:55.424788 1707070 logs.go:282] 0 containers: []
	W1124 09:27:55.424795 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:55.424800 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:55.424862 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:55.450083 1707070 cri.go:89] found id: ""
	I1124 09:27:55.450097 1707070 logs.go:282] 0 containers: []
	W1124 09:27:55.450104 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:55.450112 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:55.450170 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:55.474225 1707070 cri.go:89] found id: ""
	I1124 09:27:55.474239 1707070 logs.go:282] 0 containers: []
	W1124 09:27:55.474247 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:55.474254 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:55.474264 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:55.507455 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:55.507477 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:55.563391 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:55.563414 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:55.583115 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:55.583131 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:55.648979 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:55.641409   13700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:55.642033   13700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:55.643543   13700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:55.644021   13700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:55.645529   13700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:55.641409   13700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:55.642033   13700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:55.643543   13700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:55.644021   13700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:55.645529   13700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:55.648991 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:55.649004 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:58.210584 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:58.221285 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:58.221351 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:58.250526 1707070 cri.go:89] found id: ""
	I1124 09:27:58.250541 1707070 logs.go:282] 0 containers: []
	W1124 09:27:58.250548 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:58.250554 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:58.250612 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:58.275099 1707070 cri.go:89] found id: ""
	I1124 09:27:58.275116 1707070 logs.go:282] 0 containers: []
	W1124 09:27:58.275123 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:58.275129 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:58.275189 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:58.300058 1707070 cri.go:89] found id: ""
	I1124 09:27:58.300075 1707070 logs.go:282] 0 containers: []
	W1124 09:27:58.300082 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:58.300087 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:58.300148 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:58.323564 1707070 cri.go:89] found id: ""
	I1124 09:27:58.323578 1707070 logs.go:282] 0 containers: []
	W1124 09:27:58.323585 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:58.323591 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:58.323648 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:58.348441 1707070 cri.go:89] found id: ""
	I1124 09:27:58.348455 1707070 logs.go:282] 0 containers: []
	W1124 09:27:58.348463 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:58.348468 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:58.348527 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:58.374283 1707070 cri.go:89] found id: ""
	I1124 09:27:58.374297 1707070 logs.go:282] 0 containers: []
	W1124 09:27:58.374305 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:58.374310 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:58.374371 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:58.400624 1707070 cri.go:89] found id: ""
	I1124 09:27:58.400638 1707070 logs.go:282] 0 containers: []
	W1124 09:27:58.400645 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:58.400653 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:58.400664 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:58.457055 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:58.457075 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:58.474204 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:58.474236 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:58.538738 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:58.530985   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:58.531628   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:58.533238   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:58.533555   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:58.535049   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:58.530985   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:58.531628   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:58.533238   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:58.533555   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:58.535049   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:58.538748 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:58.538761 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:58.601043 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:58.601064 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:01.129158 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:01.152628 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:01.152709 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:01.199688 1707070 cri.go:89] found id: ""
	I1124 09:28:01.199703 1707070 logs.go:282] 0 containers: []
	W1124 09:28:01.199710 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:01.199716 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:01.199778 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:01.226293 1707070 cri.go:89] found id: ""
	I1124 09:28:01.226307 1707070 logs.go:282] 0 containers: []
	W1124 09:28:01.226314 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:01.226319 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:01.226379 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:01.252021 1707070 cri.go:89] found id: ""
	I1124 09:28:01.252036 1707070 logs.go:282] 0 containers: []
	W1124 09:28:01.252043 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:01.252049 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:01.252108 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:01.278563 1707070 cri.go:89] found id: ""
	I1124 09:28:01.278577 1707070 logs.go:282] 0 containers: []
	W1124 09:28:01.278585 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:01.278591 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:01.278697 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:01.304781 1707070 cri.go:89] found id: ""
	I1124 09:28:01.304808 1707070 logs.go:282] 0 containers: []
	W1124 09:28:01.304816 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:01.304822 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:01.304900 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:01.330549 1707070 cri.go:89] found id: ""
	I1124 09:28:01.330574 1707070 logs.go:282] 0 containers: []
	W1124 09:28:01.330581 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:01.330586 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:01.330657 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:01.355624 1707070 cri.go:89] found id: ""
	I1124 09:28:01.355646 1707070 logs.go:282] 0 containers: []
	W1124 09:28:01.355654 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:01.355661 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:01.355673 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:01.411485 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:01.411504 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:01.428912 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:01.428927 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:01.493859 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:01.485758   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:01.486490   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:01.488127   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:01.488656   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:01.490257   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:01.485758   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:01.486490   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:01.488127   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:01.488656   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:01.490257   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:01.493881 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:01.493892 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:01.554787 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:01.554808 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:04.088481 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:04.099124 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:04.099191 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:04.123836 1707070 cri.go:89] found id: ""
	I1124 09:28:04.123849 1707070 logs.go:282] 0 containers: []
	W1124 09:28:04.123857 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:04.123862 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:04.123927 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:04.159485 1707070 cri.go:89] found id: ""
	I1124 09:28:04.159499 1707070 logs.go:282] 0 containers: []
	W1124 09:28:04.159506 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:04.159511 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:04.159572 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:04.187075 1707070 cri.go:89] found id: ""
	I1124 09:28:04.187089 1707070 logs.go:282] 0 containers: []
	W1124 09:28:04.187106 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:04.187112 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:04.187169 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:04.217664 1707070 cri.go:89] found id: ""
	I1124 09:28:04.217677 1707070 logs.go:282] 0 containers: []
	W1124 09:28:04.217696 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:04.217702 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:04.217769 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:04.244060 1707070 cri.go:89] found id: ""
	I1124 09:28:04.244075 1707070 logs.go:282] 0 containers: []
	W1124 09:28:04.244082 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:04.244087 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:04.244151 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:04.269297 1707070 cri.go:89] found id: ""
	I1124 09:28:04.269311 1707070 logs.go:282] 0 containers: []
	W1124 09:28:04.269318 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:04.269323 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:04.269382 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:04.296714 1707070 cri.go:89] found id: ""
	I1124 09:28:04.296730 1707070 logs.go:282] 0 containers: []
	W1124 09:28:04.296737 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:04.296745 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:04.296760 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:04.352538 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:04.352558 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:04.370334 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:04.370357 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:04.439006 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:04.429890   13996 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:04.430808   13996 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:04.432656   13996 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:04.433242   13996 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:04.435153   13996 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:04.429890   13996 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:04.430808   13996 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:04.432656   13996 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:04.433242   13996 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:04.435153   13996 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:04.439018 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:04.439027 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:04.503050 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:04.503072 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:07.038611 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:07.049789 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:07.049861 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:07.074863 1707070 cri.go:89] found id: ""
	I1124 09:28:07.074878 1707070 logs.go:282] 0 containers: []
	W1124 09:28:07.074885 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:07.074893 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:07.074950 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:07.099042 1707070 cri.go:89] found id: ""
	I1124 09:28:07.099057 1707070 logs.go:282] 0 containers: []
	W1124 09:28:07.099064 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:07.099070 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:07.099131 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:07.123608 1707070 cri.go:89] found id: ""
	I1124 09:28:07.123622 1707070 logs.go:282] 0 containers: []
	W1124 09:28:07.123630 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:07.123635 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:07.123706 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:07.151391 1707070 cri.go:89] found id: ""
	I1124 09:28:07.151405 1707070 logs.go:282] 0 containers: []
	W1124 09:28:07.151412 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:07.151418 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:07.151475 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:07.182488 1707070 cri.go:89] found id: ""
	I1124 09:28:07.182502 1707070 logs.go:282] 0 containers: []
	W1124 09:28:07.182510 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:07.182515 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:07.182581 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:07.207523 1707070 cri.go:89] found id: ""
	I1124 09:28:07.207537 1707070 logs.go:282] 0 containers: []
	W1124 09:28:07.207546 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:07.207552 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:07.207614 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:07.233412 1707070 cri.go:89] found id: ""
	I1124 09:28:07.233426 1707070 logs.go:282] 0 containers: []
	W1124 09:28:07.233433 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:07.233441 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:07.233451 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:07.288900 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:07.288922 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:07.306472 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:07.306493 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:07.368097 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:07.360574   14102 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:07.360956   14102 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:07.362483   14102 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:07.362820   14102 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:07.364269   14102 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:07.360574   14102 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:07.360956   14102 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:07.362483   14102 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:07.362820   14102 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:07.364269   14102 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:07.368108 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:07.368121 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:07.429983 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:07.430002 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:09.965289 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:09.976378 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:09.976448 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:10.015687 1707070 cri.go:89] found id: ""
	I1124 09:28:10.015705 1707070 logs.go:282] 0 containers: []
	W1124 09:28:10.015714 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:10.015721 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:10.015811 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:10.042717 1707070 cri.go:89] found id: ""
	I1124 09:28:10.042731 1707070 logs.go:282] 0 containers: []
	W1124 09:28:10.042738 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:10.042743 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:10.042805 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:10.069226 1707070 cri.go:89] found id: ""
	I1124 09:28:10.069240 1707070 logs.go:282] 0 containers: []
	W1124 09:28:10.069259 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:10.069265 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:10.069336 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:10.094576 1707070 cri.go:89] found id: ""
	I1124 09:28:10.094591 1707070 logs.go:282] 0 containers: []
	W1124 09:28:10.094599 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:10.094604 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:10.094683 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:10.120910 1707070 cri.go:89] found id: ""
	I1124 09:28:10.120925 1707070 logs.go:282] 0 containers: []
	W1124 09:28:10.120932 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:10.120938 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:10.121007 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:10.148454 1707070 cri.go:89] found id: ""
	I1124 09:28:10.148467 1707070 logs.go:282] 0 containers: []
	W1124 09:28:10.148476 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:10.148482 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:10.148545 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:10.180342 1707070 cri.go:89] found id: ""
	I1124 09:28:10.180356 1707070 logs.go:282] 0 containers: []
	W1124 09:28:10.180363 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:10.180377 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:10.180387 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:10.237982 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:10.238001 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:10.254875 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:10.254891 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:10.315902 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:10.307876   14207 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:10.308640   14207 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:10.310183   14207 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:10.310727   14207 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:10.312228   14207 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:10.307876   14207 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:10.308640   14207 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:10.310183   14207 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:10.310727   14207 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:10.312228   14207 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:10.315912 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:10.315922 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:10.381257 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:10.381276 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:12.913595 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:12.923674 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:12.923734 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:12.947804 1707070 cri.go:89] found id: ""
	I1124 09:28:12.947818 1707070 logs.go:282] 0 containers: []
	W1124 09:28:12.947826 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:12.947832 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:12.947892 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:12.971923 1707070 cri.go:89] found id: ""
	I1124 09:28:12.971937 1707070 logs.go:282] 0 containers: []
	W1124 09:28:12.971944 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:12.971956 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:12.972017 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:12.996325 1707070 cri.go:89] found id: ""
	I1124 09:28:12.996339 1707070 logs.go:282] 0 containers: []
	W1124 09:28:12.996357 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:12.996364 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:12.996436 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:13.022187 1707070 cri.go:89] found id: ""
	I1124 09:28:13.022203 1707070 logs.go:282] 0 containers: []
	W1124 09:28:13.022211 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:13.022224 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:13.022296 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:13.048161 1707070 cri.go:89] found id: ""
	I1124 09:28:13.048184 1707070 logs.go:282] 0 containers: []
	W1124 09:28:13.048192 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:13.048198 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:13.048262 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:13.073539 1707070 cri.go:89] found id: ""
	I1124 09:28:13.073564 1707070 logs.go:282] 0 containers: []
	W1124 09:28:13.073571 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:13.073578 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:13.073655 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:13.098089 1707070 cri.go:89] found id: ""
	I1124 09:28:13.098106 1707070 logs.go:282] 0 containers: []
	W1124 09:28:13.098114 1707070 logs.go:284] No container was found matching "kindnet"
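Each sweep above queries the control-plane components one by one (`crictl ps -a --quiet --name=<component>`) and logs a warning for every name that returns no container ids. The same pattern, parameterized so it can run without a CRI socket, might look like this; `sweep_components` and its `lister` argument are illustrative names, and on the node the lister would be `sudo crictl ps -a --quiet --name`:

```shell
#!/bin/sh
# For each component name, ask the lister command for matching container ids
# and report the names that come back empty, as the log does for
# kube-apiserver, etcd, coredns, and the rest.
sweep_components() {
  lister=$1; shift
  for name in "$@"; do
    ids=$($lister "$name" 2>/dev/null)
    if [ -z "$ids" ]; then
      echo "No container was found matching \"$name\""
    fi
  done
}

# With every component down, each query returns nothing and every name is
# reported (here `true` stands in for a lister that prints no ids):
sweep_components true kube-apiserver etcd coredns kube-scheduler kube-proxy kube-controller-manager kindnet
```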
	I1124 09:28:13.098122 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:13.098132 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:13.140239 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:13.140255 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:13.197847 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:13.197865 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:13.217667 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:13.217686 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:13.281312 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:13.272865   14321 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:13.273748   14321 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:13.275370   14321 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:13.275717   14321 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:13.277237   14321 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:13.272865   14321 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:13.273748   14321 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:13.275370   14321 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:13.275717   14321 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:13.277237   14321 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:13.281322 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:13.281334 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:15.842684 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:15.853250 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:15.853311 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:15.878981 1707070 cri.go:89] found id: ""
	I1124 09:28:15.878995 1707070 logs.go:282] 0 containers: []
	W1124 09:28:15.879030 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:15.879036 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:15.879099 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:15.904674 1707070 cri.go:89] found id: ""
	I1124 09:28:15.904687 1707070 logs.go:282] 0 containers: []
	W1124 09:28:15.904695 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:15.904700 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:15.904757 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:15.929766 1707070 cri.go:89] found id: ""
	I1124 09:28:15.929780 1707070 logs.go:282] 0 containers: []
	W1124 09:28:15.929787 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:15.929793 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:15.929851 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:15.955453 1707070 cri.go:89] found id: ""
	I1124 09:28:15.955468 1707070 logs.go:282] 0 containers: []
	W1124 09:28:15.955475 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:15.955485 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:15.955543 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:15.983839 1707070 cri.go:89] found id: ""
	I1124 09:28:15.983854 1707070 logs.go:282] 0 containers: []
	W1124 09:28:15.983861 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:15.983866 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:15.983924 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:16.014730 1707070 cri.go:89] found id: ""
	I1124 09:28:16.014744 1707070 logs.go:282] 0 containers: []
	W1124 09:28:16.014752 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:16.014757 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:16.014820 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:16.046753 1707070 cri.go:89] found id: ""
	I1124 09:28:16.046767 1707070 logs.go:282] 0 containers: []
	W1124 09:28:16.046775 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:16.046783 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:16.046794 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:16.064199 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:16.064217 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:16.139691 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:16.122247   14407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:16.122923   14407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:16.124768   14407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:16.125231   14407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:16.126838   14407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:16.122247   14407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:16.122923   14407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:16.124768   14407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:16.125231   14407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:16.126838   14407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:16.139701 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:16.139711 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:16.206802 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:16.206822 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:16.234674 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:16.234690 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
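The log repeats the same pgrep/crictl/log-gathering sweep roughly every three seconds while waiting for kube-apiserver to come up. The retry shape can be sketched as a generic poll helper; `wait_for` is an illustrative name and the attempt/delay values are assumptions, not minikube's actual parameters:

```shell
#!/bin/sh
# Retry a command with a fixed delay until it succeeds or attempts run out.
wait_for() {
  attempts=$1; delay=$2; shift 2
  i=0
  while [ "$i" -lt "$attempts" ]; do
    if "$@"; then
      return 0
    fi
    i=$((i + 1))
    sleep "$delay"
  done
  return 1
}

# On the node this would wrap the apiserver probe seen in the log, e.g.:
#   wait_for 20 3 sudo pgrep -xnf 'kube-apiserver.*minikube.*'
wait_for 3 0 true && echo "condition met"
```

In the failing run above, every probe in the window came back empty, so the loop would exhaust its attempts and return nonzero.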
	I1124 09:28:18.790282 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:18.801848 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:18.801912 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:18.827821 1707070 cri.go:89] found id: ""
	I1124 09:28:18.827836 1707070 logs.go:282] 0 containers: []
	W1124 09:28:18.827843 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:18.827849 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:18.827905 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:18.852169 1707070 cri.go:89] found id: ""
	I1124 09:28:18.852184 1707070 logs.go:282] 0 containers: []
	W1124 09:28:18.852191 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:18.852196 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:18.852253 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:18.878610 1707070 cri.go:89] found id: ""
	I1124 09:28:18.878625 1707070 logs.go:282] 0 containers: []
	W1124 09:28:18.878633 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:18.878638 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:18.878702 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:18.903384 1707070 cri.go:89] found id: ""
	I1124 09:28:18.903403 1707070 logs.go:282] 0 containers: []
	W1124 09:28:18.903410 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:18.903416 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:18.903476 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:18.928519 1707070 cri.go:89] found id: ""
	I1124 09:28:18.928534 1707070 logs.go:282] 0 containers: []
	W1124 09:28:18.928542 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:18.928547 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:18.928609 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:18.956808 1707070 cri.go:89] found id: ""
	I1124 09:28:18.956823 1707070 logs.go:282] 0 containers: []
	W1124 09:28:18.956830 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:18.956836 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:18.956893 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:18.985113 1707070 cri.go:89] found id: ""
	I1124 09:28:18.985127 1707070 logs.go:282] 0 containers: []
	W1124 09:28:18.985134 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:18.985142 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:18.985152 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:19.019130 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:19.019146 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:19.075193 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:19.075213 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:19.092291 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:19.092306 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:19.162819 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:19.154959   14526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:19.155361   14526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:19.156834   14526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:19.157156   14526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:19.158629   14526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:19.154959   14526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:19.155361   14526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:19.156834   14526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:19.157156   14526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:19.158629   14526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:19.162839 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:19.162850 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:21.737895 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:21.748053 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:21.748120 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:21.773590 1707070 cri.go:89] found id: ""
	I1124 09:28:21.773604 1707070 logs.go:282] 0 containers: []
	W1124 09:28:21.773611 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:21.773618 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:21.773679 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:21.800809 1707070 cri.go:89] found id: ""
	I1124 09:28:21.800866 1707070 logs.go:282] 0 containers: []
	W1124 09:28:21.800874 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:21.800880 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:21.800938 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:21.826581 1707070 cri.go:89] found id: ""
	I1124 09:28:21.826594 1707070 logs.go:282] 0 containers: []
	W1124 09:28:21.826602 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:21.826607 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:21.826668 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:21.856267 1707070 cri.go:89] found id: ""
	I1124 09:28:21.856282 1707070 logs.go:282] 0 containers: []
	W1124 09:28:21.856289 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:21.856295 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:21.856354 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:21.885138 1707070 cri.go:89] found id: ""
	I1124 09:28:21.885152 1707070 logs.go:282] 0 containers: []
	W1124 09:28:21.885160 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:21.885165 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:21.885224 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:21.909643 1707070 cri.go:89] found id: ""
	I1124 09:28:21.909657 1707070 logs.go:282] 0 containers: []
	W1124 09:28:21.909665 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:21.909671 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:21.909727 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:21.936792 1707070 cri.go:89] found id: ""
	I1124 09:28:21.936806 1707070 logs.go:282] 0 containers: []
	W1124 09:28:21.936813 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:21.936821 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:21.936831 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:21.993870 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:21.993890 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:22.011453 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:22.011474 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:22.078376 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:22.069998   14620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:22.070791   14620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:22.072423   14620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:22.073020   14620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:22.074616   14620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:22.069998   14620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:22.070791   14620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:22.072423   14620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:22.073020   14620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:22.074616   14620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:22.078387 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:22.078398 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:22.140934 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:22.140953 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:24.669313 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:24.679257 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:24.679328 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:24.707632 1707070 cri.go:89] found id: ""
	I1124 09:28:24.707647 1707070 logs.go:282] 0 containers: []
	W1124 09:28:24.707654 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:24.707660 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:24.707720 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:24.733688 1707070 cri.go:89] found id: ""
	I1124 09:28:24.733702 1707070 logs.go:282] 0 containers: []
	W1124 09:28:24.733710 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:24.733715 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:24.733773 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:24.759056 1707070 cri.go:89] found id: ""
	I1124 09:28:24.759071 1707070 logs.go:282] 0 containers: []
	W1124 09:28:24.759078 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:24.759084 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:24.759143 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:24.789918 1707070 cri.go:89] found id: ""
	I1124 09:28:24.789931 1707070 logs.go:282] 0 containers: []
	W1124 09:28:24.789938 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:24.789944 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:24.790003 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:24.814684 1707070 cri.go:89] found id: ""
	I1124 09:28:24.814698 1707070 logs.go:282] 0 containers: []
	W1124 09:28:24.814709 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:24.814714 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:24.814773 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:24.839467 1707070 cri.go:89] found id: ""
	I1124 09:28:24.839489 1707070 logs.go:282] 0 containers: []
	W1124 09:28:24.839497 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:24.839503 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:24.839568 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:24.863902 1707070 cri.go:89] found id: ""
	I1124 09:28:24.863917 1707070 logs.go:282] 0 containers: []
	W1124 09:28:24.863925 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:24.863933 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:24.863943 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:24.919300 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:24.919320 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:24.936150 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:24.936167 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:24.998414 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:24.990181   14723 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:24.990882   14723 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:24.992541   14723 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:24.993206   14723 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:24.994900   14723 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:24.990181   14723 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:24.990882   14723 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:24.992541   14723 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:24.993206   14723 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:24.994900   14723 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:24.998425 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:24.998435 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:25.062735 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:25.062756 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:27.591381 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:27.601598 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:27.601658 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:27.626062 1707070 cri.go:89] found id: ""
	I1124 09:28:27.626076 1707070 logs.go:282] 0 containers: []
	W1124 09:28:27.626084 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:27.626090 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:27.626152 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:27.654571 1707070 cri.go:89] found id: ""
	I1124 09:28:27.654591 1707070 logs.go:282] 0 containers: []
	W1124 09:28:27.654599 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:27.654604 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:27.654664 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:27.679294 1707070 cri.go:89] found id: ""
	I1124 09:28:27.679308 1707070 logs.go:282] 0 containers: []
	W1124 09:28:27.679315 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:27.679320 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:27.679377 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:27.702575 1707070 cri.go:89] found id: ""
	I1124 09:28:27.702588 1707070 logs.go:282] 0 containers: []
	W1124 09:28:27.702595 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:27.702601 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:27.702657 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:27.728251 1707070 cri.go:89] found id: ""
	I1124 09:28:27.728266 1707070 logs.go:282] 0 containers: []
	W1124 09:28:27.728273 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:27.728279 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:27.728339 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:27.752789 1707070 cri.go:89] found id: ""
	I1124 09:28:27.752802 1707070 logs.go:282] 0 containers: []
	W1124 09:28:27.752809 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:27.752815 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:27.752874 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:27.776833 1707070 cri.go:89] found id: ""
	I1124 09:28:27.776847 1707070 logs.go:282] 0 containers: []
	W1124 09:28:27.776854 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:27.776862 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:27.776871 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:27.837612 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:27.837637 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:27.866873 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:27.866890 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:27.925473 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:27.925492 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:27.942415 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:27.942432 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:28.014797 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:27.999267   14840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:28.000058   14840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:28.002028   14840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:28.002995   14840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:28.005197   14840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:27.999267   14840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:28.000058   14840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:28.002028   14840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:28.002995   14840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:28.005197   14840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:30.515707 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:30.526026 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:30.526102 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:30.550904 1707070 cri.go:89] found id: ""
	I1124 09:28:30.550918 1707070 logs.go:282] 0 containers: []
	W1124 09:28:30.550925 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:30.550931 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:30.550996 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:30.580837 1707070 cri.go:89] found id: ""
	I1124 09:28:30.580851 1707070 logs.go:282] 0 containers: []
	W1124 09:28:30.580859 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:30.580864 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:30.580920 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:30.605291 1707070 cri.go:89] found id: ""
	I1124 09:28:30.605305 1707070 logs.go:282] 0 containers: []
	W1124 09:28:30.605312 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:30.605318 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:30.605376 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:30.630158 1707070 cri.go:89] found id: ""
	I1124 09:28:30.630172 1707070 logs.go:282] 0 containers: []
	W1124 09:28:30.630181 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:30.630187 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:30.630254 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:30.653754 1707070 cri.go:89] found id: ""
	I1124 09:28:30.653772 1707070 logs.go:282] 0 containers: []
	W1124 09:28:30.653785 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:30.653790 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:30.653868 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:30.679137 1707070 cri.go:89] found id: ""
	I1124 09:28:30.679150 1707070 logs.go:282] 0 containers: []
	W1124 09:28:30.679157 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:30.679163 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:30.679221 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:30.703850 1707070 cri.go:89] found id: ""
	I1124 09:28:30.703864 1707070 logs.go:282] 0 containers: []
	W1124 09:28:30.703871 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:30.703879 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:30.703888 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:30.772547 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:30.764218   14925 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:30.764926   14925 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:30.766593   14925 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:30.767134   14925 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:30.768991   14925 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:30.764218   14925 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:30.764926   14925 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:30.766593   14925 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:30.767134   14925 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:30.768991   14925 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:30.772557 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:30.772568 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:30.834024 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:30.834043 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:30.862031 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:30.862046 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:30.920292 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:30.920311 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:33.438606 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:33.448762 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:33.448822 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:33.481032 1707070 cri.go:89] found id: ""
	I1124 09:28:33.481046 1707070 logs.go:282] 0 containers: []
	W1124 09:28:33.481053 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:33.481060 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:33.481117 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:33.504561 1707070 cri.go:89] found id: ""
	I1124 09:28:33.504576 1707070 logs.go:282] 0 containers: []
	W1124 09:28:33.504583 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:33.504589 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:33.504654 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:33.528885 1707070 cri.go:89] found id: ""
	I1124 09:28:33.528899 1707070 logs.go:282] 0 containers: []
	W1124 09:28:33.528906 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:33.528915 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:33.528972 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:33.553244 1707070 cri.go:89] found id: ""
	I1124 09:28:33.553258 1707070 logs.go:282] 0 containers: []
	W1124 09:28:33.553271 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:33.553277 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:33.553334 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:33.578519 1707070 cri.go:89] found id: ""
	I1124 09:28:33.578533 1707070 logs.go:282] 0 containers: []
	W1124 09:28:33.578541 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:33.578546 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:33.578607 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:33.602708 1707070 cri.go:89] found id: ""
	I1124 09:28:33.602721 1707070 logs.go:282] 0 containers: []
	W1124 09:28:33.602729 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:33.602734 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:33.602791 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:33.626894 1707070 cri.go:89] found id: ""
	I1124 09:28:33.626908 1707070 logs.go:282] 0 containers: []
	W1124 09:28:33.626916 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:33.626923 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:33.626934 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:33.684867 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:33.684887 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:33.701817 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:33.701834 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:33.775161 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:33.766757   15037 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:33.767480   15037 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:33.769022   15037 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:33.769484   15037 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:33.770951   15037 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:33.766757   15037 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:33.767480   15037 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:33.769022   15037 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:33.769484   15037 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:33.770951   15037 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:33.775172 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:33.775185 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:33.837667 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:33.837688 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:36.365266 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:36.376558 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:36.376622 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:36.412692 1707070 cri.go:89] found id: ""
	I1124 09:28:36.412706 1707070 logs.go:282] 0 containers: []
	W1124 09:28:36.412714 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:36.412719 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:36.412777 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:36.448943 1707070 cri.go:89] found id: ""
	I1124 09:28:36.448957 1707070 logs.go:282] 0 containers: []
	W1124 09:28:36.448964 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:36.448970 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:36.449031 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:36.474906 1707070 cri.go:89] found id: ""
	I1124 09:28:36.474920 1707070 logs.go:282] 0 containers: []
	W1124 09:28:36.474928 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:36.474934 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:36.474990 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:36.503770 1707070 cri.go:89] found id: ""
	I1124 09:28:36.503784 1707070 logs.go:282] 0 containers: []
	W1124 09:28:36.503792 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:36.503797 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:36.503863 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:36.532858 1707070 cri.go:89] found id: ""
	I1124 09:28:36.532872 1707070 logs.go:282] 0 containers: []
	W1124 09:28:36.532880 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:36.532885 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:36.532944 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:36.557874 1707070 cri.go:89] found id: ""
	I1124 09:28:36.557889 1707070 logs.go:282] 0 containers: []
	W1124 09:28:36.557896 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:36.557902 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:36.557959 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:36.582175 1707070 cri.go:89] found id: ""
	I1124 09:28:36.582189 1707070 logs.go:282] 0 containers: []
	W1124 09:28:36.582204 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:36.582212 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:36.582230 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:36.645586 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:36.637487   15138 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:36.638140   15138 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:36.639873   15138 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:36.640429   15138 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:36.641968   15138 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:36.637487   15138 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:36.638140   15138 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:36.639873   15138 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:36.640429   15138 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:36.641968   15138 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:36.645596 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:36.645607 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:36.708211 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:36.708231 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:36.740877 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:36.740894 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:36.798376 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:36.798396 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:39.316746 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:39.327050 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:39.327111 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:39.351416 1707070 cri.go:89] found id: ""
	I1124 09:28:39.351430 1707070 logs.go:282] 0 containers: []
	W1124 09:28:39.351438 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:39.351444 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:39.351500 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:39.375341 1707070 cri.go:89] found id: ""
	I1124 09:28:39.375355 1707070 logs.go:282] 0 containers: []
	W1124 09:28:39.375362 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:39.375367 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:39.375425 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:39.402220 1707070 cri.go:89] found id: ""
	I1124 09:28:39.402235 1707070 logs.go:282] 0 containers: []
	W1124 09:28:39.402241 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:39.402247 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:39.402306 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:39.434081 1707070 cri.go:89] found id: ""
	I1124 09:28:39.434094 1707070 logs.go:282] 0 containers: []
	W1124 09:28:39.434101 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:39.434107 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:39.434167 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:39.467514 1707070 cri.go:89] found id: ""
	I1124 09:28:39.467528 1707070 logs.go:282] 0 containers: []
	W1124 09:28:39.467535 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:39.467540 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:39.467597 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:39.500947 1707070 cri.go:89] found id: ""
	I1124 09:28:39.500961 1707070 logs.go:282] 0 containers: []
	W1124 09:28:39.500968 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:39.500974 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:39.501034 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:39.526637 1707070 cri.go:89] found id: ""
	I1124 09:28:39.526651 1707070 logs.go:282] 0 containers: []
	W1124 09:28:39.526658 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:39.526666 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:39.526676 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:39.582247 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:39.582268 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:39.599751 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:39.599767 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:39.668271 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:39.660949   15251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:39.661446   15251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:39.663149   15251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:39.663643   15251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:39.664706   15251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:39.660949   15251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:39.661446   15251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:39.663149   15251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:39.663643   15251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:39.664706   15251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:39.668281 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:39.668294 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:39.730931 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:39.730951 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:42.260305 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:42.272405 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:42.272489 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:42.300817 1707070 cri.go:89] found id: ""
	I1124 09:28:42.300842 1707070 logs.go:282] 0 containers: []
	W1124 09:28:42.300850 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:42.300856 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:42.300921 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:42.327350 1707070 cri.go:89] found id: ""
	I1124 09:28:42.327368 1707070 logs.go:282] 0 containers: []
	W1124 09:28:42.327377 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:42.327382 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:42.327441 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:42.352768 1707070 cri.go:89] found id: ""
	I1124 09:28:42.352781 1707070 logs.go:282] 0 containers: []
	W1124 09:28:42.352788 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:42.352794 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:42.352858 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:42.384996 1707070 cri.go:89] found id: ""
	I1124 09:28:42.385016 1707070 logs.go:282] 0 containers: []
	W1124 09:28:42.385024 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:42.385035 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:42.385109 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:42.433916 1707070 cri.go:89] found id: ""
	I1124 09:28:42.433942 1707070 logs.go:282] 0 containers: []
	W1124 09:28:42.433963 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:42.433974 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:42.434041 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:42.469962 1707070 cri.go:89] found id: ""
	I1124 09:28:42.469976 1707070 logs.go:282] 0 containers: []
	W1124 09:28:42.469983 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:42.469989 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:42.470045 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:42.494905 1707070 cri.go:89] found id: ""
	I1124 09:28:42.494919 1707070 logs.go:282] 0 containers: []
	W1124 09:28:42.494926 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:42.494934 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:42.494944 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:42.551276 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:42.551295 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:42.568521 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:42.568538 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:42.631652 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:42.623578   15356 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:42.624203   15356 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:42.625718   15356 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:42.626134   15356 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:42.627653   15356 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:42.623578   15356 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:42.624203   15356 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:42.625718   15356 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:42.626134   15356 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:42.627653   15356 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:42.631662 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:42.631689 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:42.697554 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:42.697573 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:45.228012 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:45.242540 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:45.242663 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:45.285651 1707070 cri.go:89] found id: ""
	I1124 09:28:45.285666 1707070 logs.go:282] 0 containers: []
	W1124 09:28:45.285673 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:45.285679 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:45.285747 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:45.315729 1707070 cri.go:89] found id: ""
	I1124 09:28:45.315744 1707070 logs.go:282] 0 containers: []
	W1124 09:28:45.315759 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:45.315766 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:45.315838 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:45.342027 1707070 cri.go:89] found id: ""
	I1124 09:28:45.342041 1707070 logs.go:282] 0 containers: []
	W1124 09:28:45.342048 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:45.342053 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:45.342112 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:45.368019 1707070 cri.go:89] found id: ""
	I1124 09:28:45.368033 1707070 logs.go:282] 0 containers: []
	W1124 09:28:45.368040 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:45.368046 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:45.368102 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:45.406091 1707070 cri.go:89] found id: ""
	I1124 09:28:45.406104 1707070 logs.go:282] 0 containers: []
	W1124 09:28:45.406112 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:45.406119 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:45.406176 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:45.432356 1707070 cri.go:89] found id: ""
	I1124 09:28:45.432369 1707070 logs.go:282] 0 containers: []
	W1124 09:28:45.432377 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:45.432382 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:45.432449 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:45.465291 1707070 cri.go:89] found id: ""
	I1124 09:28:45.465315 1707070 logs.go:282] 0 containers: []
	W1124 09:28:45.465324 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:45.465332 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:45.465345 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:45.527756 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:45.527784 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:45.544616 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:45.544642 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:45.606842 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:45.598345   15463 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:45.599427   15463 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:45.600949   15463 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:45.601550   15463 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:45.603105   15463 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:45.598345   15463 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:45.599427   15463 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:45.600949   15463 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:45.601550   15463 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:45.603105   15463 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:45.606853 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:45.606866 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:45.669056 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:45.669077 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:48.198708 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:48.210384 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:48.210449 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:48.235268 1707070 cri.go:89] found id: ""
	I1124 09:28:48.235282 1707070 logs.go:282] 0 containers: []
	W1124 09:28:48.235289 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:48.235295 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:48.235357 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:48.261413 1707070 cri.go:89] found id: ""
	I1124 09:28:48.261427 1707070 logs.go:282] 0 containers: []
	W1124 09:28:48.261434 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:48.261439 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:48.261496 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:48.291100 1707070 cri.go:89] found id: ""
	I1124 09:28:48.291114 1707070 logs.go:282] 0 containers: []
	W1124 09:28:48.291122 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:48.291127 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:48.291186 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:48.326388 1707070 cri.go:89] found id: ""
	I1124 09:28:48.326412 1707070 logs.go:282] 0 containers: []
	W1124 09:28:48.326420 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:48.326426 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:48.326499 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:48.356212 1707070 cri.go:89] found id: ""
	I1124 09:28:48.356227 1707070 logs.go:282] 0 containers: []
	W1124 09:28:48.356234 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:48.356240 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:48.356299 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:48.384677 1707070 cri.go:89] found id: ""
	I1124 09:28:48.384690 1707070 logs.go:282] 0 containers: []
	W1124 09:28:48.384697 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:48.384703 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:48.384759 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:48.422001 1707070 cri.go:89] found id: ""
	I1124 09:28:48.422015 1707070 logs.go:282] 0 containers: []
	W1124 09:28:48.422022 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:48.422030 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:48.422040 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:48.492980 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:48.493001 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:48.522367 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:48.522383 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:48.577847 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:48.577866 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:48.594803 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:48.594821 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:48.662402 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:48.654176   15580 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:48.655485   15580 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:48.656131   15580 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:48.657084   15580 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:48.658755   15580 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:48.654176   15580 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:48.655485   15580 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:48.656131   15580 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:48.657084   15580 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:48.658755   15580 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:51.162680 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:51.173802 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:51.173865 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:51.200124 1707070 cri.go:89] found id: ""
	I1124 09:28:51.200146 1707070 logs.go:282] 0 containers: []
	W1124 09:28:51.200155 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:51.200161 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:51.200220 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:51.225309 1707070 cri.go:89] found id: ""
	I1124 09:28:51.225323 1707070 logs.go:282] 0 containers: []
	W1124 09:28:51.225330 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:51.225335 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:51.225392 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:51.249971 1707070 cri.go:89] found id: ""
	I1124 09:28:51.249985 1707070 logs.go:282] 0 containers: []
	W1124 09:28:51.249992 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:51.249997 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:51.250053 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:51.275848 1707070 cri.go:89] found id: ""
	I1124 09:28:51.275861 1707070 logs.go:282] 0 containers: []
	W1124 09:28:51.275868 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:51.275874 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:51.275929 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:51.304356 1707070 cri.go:89] found id: ""
	I1124 09:28:51.304370 1707070 logs.go:282] 0 containers: []
	W1124 09:28:51.304386 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:51.304392 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:51.304450 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:51.329000 1707070 cri.go:89] found id: ""
	I1124 09:28:51.329015 1707070 logs.go:282] 0 containers: []
	W1124 09:28:51.329021 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:51.329027 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:51.329099 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:51.357783 1707070 cri.go:89] found id: ""
	I1124 09:28:51.357796 1707070 logs.go:282] 0 containers: []
	W1124 09:28:51.357804 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:51.357811 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:51.357820 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:51.426561 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:51.426582 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:51.456185 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:51.456202 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:51.512504 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:51.512525 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:51.530860 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:51.530877 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:51.596556 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:51.586703   15685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:51.587508   15685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:51.589233   15685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:51.589675   15685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:51.591800   15685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:51.586703   15685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:51.587508   15685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:51.589233   15685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:51.589675   15685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:51.591800   15685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:54.097448 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:54.107646 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:54.107710 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:54.131850 1707070 cri.go:89] found id: ""
	I1124 09:28:54.131869 1707070 logs.go:282] 0 containers: []
	W1124 09:28:54.131877 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:54.131883 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:54.131950 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:54.157778 1707070 cri.go:89] found id: ""
	I1124 09:28:54.157793 1707070 logs.go:282] 0 containers: []
	W1124 09:28:54.157800 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:54.157806 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:54.157871 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:54.183638 1707070 cri.go:89] found id: ""
	I1124 09:28:54.183661 1707070 logs.go:282] 0 containers: []
	W1124 09:28:54.183668 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:54.183676 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:54.183745 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:54.208654 1707070 cri.go:89] found id: ""
	I1124 09:28:54.208668 1707070 logs.go:282] 0 containers: []
	W1124 09:28:54.208675 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:54.208680 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:54.208741 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:54.237302 1707070 cri.go:89] found id: ""
	I1124 09:28:54.237317 1707070 logs.go:282] 0 containers: []
	W1124 09:28:54.237325 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:54.237331 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:54.237390 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:54.261089 1707070 cri.go:89] found id: ""
	I1124 09:28:54.261111 1707070 logs.go:282] 0 containers: []
	W1124 09:28:54.261119 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:54.261124 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:54.261195 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:54.289315 1707070 cri.go:89] found id: ""
	I1124 09:28:54.289337 1707070 logs.go:282] 0 containers: []
	W1124 09:28:54.289345 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:54.289353 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:54.289363 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:54.350840 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:54.350861 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:54.391880 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:54.391897 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:54.457044 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:54.457066 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:54.475507 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:54.475525 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:54.538358 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:54.529952   15794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:54.530805   15794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:54.531583   15794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:54.533115   15794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:54.533777   15794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:54.529952   15794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:54.530805   15794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:54.531583   15794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:54.533115   15794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:54.533777   15794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:57.040068 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:57.050642 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:57.050707 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:57.075811 1707070 cri.go:89] found id: ""
	I1124 09:28:57.075824 1707070 logs.go:282] 0 containers: []
	W1124 09:28:57.075832 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:57.075837 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:57.075899 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:57.106029 1707070 cri.go:89] found id: ""
	I1124 09:28:57.106044 1707070 logs.go:282] 0 containers: []
	W1124 09:28:57.106052 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:57.106058 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:57.106114 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:57.132742 1707070 cri.go:89] found id: ""
	I1124 09:28:57.132756 1707070 logs.go:282] 0 containers: []
	W1124 09:28:57.132763 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:57.132768 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:57.132825 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:57.156809 1707070 cri.go:89] found id: ""
	I1124 09:28:57.156823 1707070 logs.go:282] 0 containers: []
	W1124 09:28:57.156830 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:57.156835 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:57.156898 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:57.182649 1707070 cri.go:89] found id: ""
	I1124 09:28:57.182663 1707070 logs.go:282] 0 containers: []
	W1124 09:28:57.182670 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:57.182676 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:57.182733 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:57.206184 1707070 cri.go:89] found id: ""
	I1124 09:28:57.206198 1707070 logs.go:282] 0 containers: []
	W1124 09:28:57.206205 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:57.206211 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:57.206275 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:57.230629 1707070 cri.go:89] found id: ""
	I1124 09:28:57.230643 1707070 logs.go:282] 0 containers: []
	W1124 09:28:57.230651 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:57.230660 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:57.230670 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:57.287168 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:57.287187 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:57.304021 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:57.304037 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:57.368613 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:57.361126   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:57.361623   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:57.363259   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:57.363659   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:57.365140   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:57.361126   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:57.361623   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:57.363259   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:57.363659   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:57.365140   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:57.368624 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:57.368635 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:57.439834 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:57.439854 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:59.971306 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:59.982006 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:59.982066 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:00.016934 1707070 cri.go:89] found id: ""
	I1124 09:29:00.016951 1707070 logs.go:282] 0 containers: []
	W1124 09:29:00.016966 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:00.016973 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:00.017049 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:00.103638 1707070 cri.go:89] found id: ""
	I1124 09:29:00.103654 1707070 logs.go:282] 0 containers: []
	W1124 09:29:00.103663 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:00.103669 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:00.103740 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:00.170246 1707070 cri.go:89] found id: ""
	I1124 09:29:00.170264 1707070 logs.go:282] 0 containers: []
	W1124 09:29:00.170273 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:00.170280 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:00.170350 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:00.236365 1707070 cri.go:89] found id: ""
	I1124 09:29:00.236382 1707070 logs.go:282] 0 containers: []
	W1124 09:29:00.236390 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:00.236397 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:00.236474 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:00.304007 1707070 cri.go:89] found id: ""
	I1124 09:29:00.304026 1707070 logs.go:282] 0 containers: []
	W1124 09:29:00.304036 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:00.304048 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:00.304139 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:00.347892 1707070 cri.go:89] found id: ""
	I1124 09:29:00.347907 1707070 logs.go:282] 0 containers: []
	W1124 09:29:00.347916 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:00.347924 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:00.348047 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:00.392276 1707070 cri.go:89] found id: ""
	I1124 09:29:00.392292 1707070 logs.go:282] 0 containers: []
	W1124 09:29:00.392304 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:00.392314 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:00.392328 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:00.445097 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:00.445118 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:00.507903 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:00.507923 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:00.532762 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:00.532787 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:00.603329 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:00.595058   16004 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:00.595595   16004 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:00.597748   16004 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:00.598425   16004 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:00.599635   16004 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:00.595058   16004 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:00.595595   16004 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:00.597748   16004 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:00.598425   16004 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:00.599635   16004 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:00.603341 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:00.603352 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:03.164630 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:03.174868 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:03.174928 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:03.198952 1707070 cri.go:89] found id: ""
	I1124 09:29:03.198966 1707070 logs.go:282] 0 containers: []
	W1124 09:29:03.198973 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:03.198979 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:03.199038 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:03.228049 1707070 cri.go:89] found id: ""
	I1124 09:29:03.228063 1707070 logs.go:282] 0 containers: []
	W1124 09:29:03.228070 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:03.228075 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:03.228133 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:03.253873 1707070 cri.go:89] found id: ""
	I1124 09:29:03.253888 1707070 logs.go:282] 0 containers: []
	W1124 09:29:03.253895 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:03.253901 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:03.253969 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:03.277874 1707070 cri.go:89] found id: ""
	I1124 09:29:03.277889 1707070 logs.go:282] 0 containers: []
	W1124 09:29:03.277903 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:03.277909 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:03.277966 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:03.306311 1707070 cri.go:89] found id: ""
	I1124 09:29:03.306333 1707070 logs.go:282] 0 containers: []
	W1124 09:29:03.306340 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:03.306345 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:03.306402 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:03.330412 1707070 cri.go:89] found id: ""
	I1124 09:29:03.330425 1707070 logs.go:282] 0 containers: []
	W1124 09:29:03.330432 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:03.330438 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:03.330572 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:03.359087 1707070 cri.go:89] found id: ""
	I1124 09:29:03.359101 1707070 logs.go:282] 0 containers: []
	W1124 09:29:03.359108 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:03.359116 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:03.359125 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:03.430996 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:03.431015 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:03.467444 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:03.467460 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:03.526316 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:03.526336 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:03.543233 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:03.543250 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:03.605146 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:03.596435   16110 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:03.597161   16110 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:03.598917   16110 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:03.599598   16110 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:03.601425   16110 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:03.596435   16110 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:03.597161   16110 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:03.598917   16110 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:03.599598   16110 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:03.601425   16110 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:06.105406 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:06.116034 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:06.116093 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:06.140111 1707070 cri.go:89] found id: ""
	I1124 09:29:06.140125 1707070 logs.go:282] 0 containers: []
	W1124 09:29:06.140132 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:06.140137 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:06.140195 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:06.164893 1707070 cri.go:89] found id: ""
	I1124 09:29:06.164907 1707070 logs.go:282] 0 containers: []
	W1124 09:29:06.164914 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:06.164920 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:06.164979 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:06.190122 1707070 cri.go:89] found id: ""
	I1124 09:29:06.190137 1707070 logs.go:282] 0 containers: []
	W1124 09:29:06.190144 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:06.190149 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:06.190206 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:06.215548 1707070 cri.go:89] found id: ""
	I1124 09:29:06.215562 1707070 logs.go:282] 0 containers: []
	W1124 09:29:06.215569 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:06.215575 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:06.215630 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:06.239566 1707070 cri.go:89] found id: ""
	I1124 09:29:06.239592 1707070 logs.go:282] 0 containers: []
	W1124 09:29:06.239600 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:06.239605 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:06.239662 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:06.266190 1707070 cri.go:89] found id: ""
	I1124 09:29:06.266223 1707070 logs.go:282] 0 containers: []
	W1124 09:29:06.266232 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:06.266237 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:06.266301 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:06.289910 1707070 cri.go:89] found id: ""
	I1124 09:29:06.289923 1707070 logs.go:282] 0 containers: []
	W1124 09:29:06.289930 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:06.289939 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:06.289955 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:06.353044 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:06.345412   16195 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:06.345855   16195 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:06.347499   16195 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:06.347885   16195 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:06.349511   16195 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:06.345412   16195 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:06.345855   16195 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:06.347499   16195 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:06.347885   16195 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:06.349511   16195 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:06.353054 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:06.353068 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:06.420094 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:06.420114 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:06.452708 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:06.452724 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:06.508689 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:06.508708 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:09.026433 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:09.036862 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:09.036926 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:09.061951 1707070 cri.go:89] found id: ""
	I1124 09:29:09.061965 1707070 logs.go:282] 0 containers: []
	W1124 09:29:09.061972 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:09.061977 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:09.062035 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:09.087954 1707070 cri.go:89] found id: ""
	I1124 09:29:09.087968 1707070 logs.go:282] 0 containers: []
	W1124 09:29:09.087976 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:09.087981 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:09.088044 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:09.112784 1707070 cri.go:89] found id: ""
	I1124 09:29:09.112798 1707070 logs.go:282] 0 containers: []
	W1124 09:29:09.112805 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:09.112810 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:09.112869 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:09.137324 1707070 cri.go:89] found id: ""
	I1124 09:29:09.137339 1707070 logs.go:282] 0 containers: []
	W1124 09:29:09.137347 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:09.137353 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:09.137413 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:09.162408 1707070 cri.go:89] found id: ""
	I1124 09:29:09.162422 1707070 logs.go:282] 0 containers: []
	W1124 09:29:09.162430 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:09.162435 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:09.162513 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:09.191279 1707070 cri.go:89] found id: ""
	I1124 09:29:09.191293 1707070 logs.go:282] 0 containers: []
	W1124 09:29:09.191300 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:09.191305 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:09.191361 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:09.214616 1707070 cri.go:89] found id: ""
	I1124 09:29:09.214630 1707070 logs.go:282] 0 containers: []
	W1124 09:29:09.214637 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:09.214645 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:09.214657 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:09.270146 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:09.270164 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:09.287320 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:09.287340 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:09.352488 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:09.344015   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:09.344642   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:09.346617   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:09.347280   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:09.348952   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:09.344015   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:09.344642   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:09.346617   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:09.347280   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:09.348952   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:09.352499 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:09.352510 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:09.418511 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:09.418532 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:11.954969 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:11.967024 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:11.967089 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:11.990717 1707070 cri.go:89] found id: ""
	I1124 09:29:11.990733 1707070 logs.go:282] 0 containers: []
	W1124 09:29:11.990741 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:11.990746 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:11.990809 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:12.020399 1707070 cri.go:89] found id: ""
	I1124 09:29:12.020413 1707070 logs.go:282] 0 containers: []
	W1124 09:29:12.020421 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:12.020427 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:12.020495 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:12.047081 1707070 cri.go:89] found id: ""
	I1124 09:29:12.047105 1707070 logs.go:282] 0 containers: []
	W1124 09:29:12.047114 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:12.047120 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:12.047185 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:12.072046 1707070 cri.go:89] found id: ""
	I1124 09:29:12.072060 1707070 logs.go:282] 0 containers: []
	W1124 09:29:12.072068 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:12.072074 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:12.072131 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:12.103533 1707070 cri.go:89] found id: ""
	I1124 09:29:12.103547 1707070 logs.go:282] 0 containers: []
	W1124 09:29:12.103554 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:12.103559 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:12.103619 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:12.131885 1707070 cri.go:89] found id: ""
	I1124 09:29:12.131900 1707070 logs.go:282] 0 containers: []
	W1124 09:29:12.131908 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:12.131914 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:12.131977 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:12.156166 1707070 cri.go:89] found id: ""
	I1124 09:29:12.156180 1707070 logs.go:282] 0 containers: []
	W1124 09:29:12.156187 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:12.156195 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:12.156206 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:12.184115 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:12.184131 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:12.239534 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:12.239553 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:12.256920 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:12.256937 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:12.322513 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:12.315053   16419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:12.315552   16419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:12.317173   16419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:12.317659   16419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:12.319113   16419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:12.315053   16419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:12.315552   16419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:12.317173   16419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:12.317659   16419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:12.319113   16419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:12.322536 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:12.322546 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:14.891198 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:14.901386 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:14.901446 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:14.926318 1707070 cri.go:89] found id: ""
	I1124 09:29:14.926340 1707070 logs.go:282] 0 containers: []
	W1124 09:29:14.926347 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:14.926353 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:14.926413 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:14.955083 1707070 cri.go:89] found id: ""
	I1124 09:29:14.955097 1707070 logs.go:282] 0 containers: []
	W1124 09:29:14.955104 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:14.955110 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:14.955167 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:14.979745 1707070 cri.go:89] found id: ""
	I1124 09:29:14.979758 1707070 logs.go:282] 0 containers: []
	W1124 09:29:14.979766 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:14.979771 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:14.979829 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:15.004845 1707070 cri.go:89] found id: ""
	I1124 09:29:15.004861 1707070 logs.go:282] 0 containers: []
	W1124 09:29:15.004869 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:15.004875 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:15.004952 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:15.044211 1707070 cri.go:89] found id: ""
	I1124 09:29:15.044225 1707070 logs.go:282] 0 containers: []
	W1124 09:29:15.044237 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:15.044243 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:15.044330 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:15.075656 1707070 cri.go:89] found id: ""
	I1124 09:29:15.075669 1707070 logs.go:282] 0 containers: []
	W1124 09:29:15.075677 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:15.075682 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:15.075740 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:15.101378 1707070 cri.go:89] found id: ""
	I1124 09:29:15.101392 1707070 logs.go:282] 0 containers: []
	W1124 09:29:15.101400 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:15.101408 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:15.101418 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:15.159297 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:15.159316 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:15.176523 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:15.176541 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:15.242899 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:15.234359   16513 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:15.235294   16513 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:15.237104   16513 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:15.237675   16513 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:15.239169   16513 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:15.234359   16513 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:15.235294   16513 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:15.237104   16513 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:15.237675   16513 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:15.239169   16513 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:15.242909 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:15.242919 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:15.304297 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:15.304319 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:17.833530 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:17.843418 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:17.843476 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:17.867779 1707070 cri.go:89] found id: ""
	I1124 09:29:17.867793 1707070 logs.go:282] 0 containers: []
	W1124 09:29:17.867806 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:17.867811 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:17.867866 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:17.891077 1707070 cri.go:89] found id: ""
	I1124 09:29:17.891090 1707070 logs.go:282] 0 containers: []
	W1124 09:29:17.891098 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:17.891103 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:17.891187 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:17.915275 1707070 cri.go:89] found id: ""
	I1124 09:29:17.915289 1707070 logs.go:282] 0 containers: []
	W1124 09:29:17.915296 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:17.915301 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:17.915357 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:17.943098 1707070 cri.go:89] found id: ""
	I1124 09:29:17.943111 1707070 logs.go:282] 0 containers: []
	W1124 09:29:17.943119 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:17.943124 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:17.943186 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:17.968417 1707070 cri.go:89] found id: ""
	I1124 09:29:17.968430 1707070 logs.go:282] 0 containers: []
	W1124 09:29:17.968437 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:17.968443 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:17.968501 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:17.993301 1707070 cri.go:89] found id: ""
	I1124 09:29:17.993315 1707070 logs.go:282] 0 containers: []
	W1124 09:29:17.993322 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:17.993328 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:17.993385 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:18.021715 1707070 cri.go:89] found id: ""
	I1124 09:29:18.021730 1707070 logs.go:282] 0 containers: []
	W1124 09:29:18.021738 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:18.021746 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:18.021756 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:18.085324 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:18.085345 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:18.118128 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:18.118159 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:18.182148 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:18.182171 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:18.199970 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:18.199990 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:18.266928 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:18.258137   16633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:18.258818   16633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:18.260418   16633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:18.261036   16633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:18.262678   16633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:18.258137   16633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:18.258818   16633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:18.260418   16633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:18.261036   16633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:18.262678   16633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:20.768145 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:20.780890 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:20.780956 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:20.807227 1707070 cri.go:89] found id: ""
	I1124 09:29:20.807241 1707070 logs.go:282] 0 containers: []
	W1124 09:29:20.807248 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:20.807253 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:20.807317 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:20.836452 1707070 cri.go:89] found id: ""
	I1124 09:29:20.836466 1707070 logs.go:282] 0 containers: []
	W1124 09:29:20.836473 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:20.836478 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:20.836535 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:20.861534 1707070 cri.go:89] found id: ""
	I1124 09:29:20.861549 1707070 logs.go:282] 0 containers: []
	W1124 09:29:20.861556 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:20.861561 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:20.861620 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:20.890181 1707070 cri.go:89] found id: ""
	I1124 09:29:20.890196 1707070 logs.go:282] 0 containers: []
	W1124 09:29:20.890203 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:20.890209 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:20.890278 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:20.919882 1707070 cri.go:89] found id: ""
	I1124 09:29:20.919897 1707070 logs.go:282] 0 containers: []
	W1124 09:29:20.919904 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:20.919910 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:20.919973 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:20.948347 1707070 cri.go:89] found id: ""
	I1124 09:29:20.948361 1707070 logs.go:282] 0 containers: []
	W1124 09:29:20.948368 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:20.948373 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:20.948428 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:20.972834 1707070 cri.go:89] found id: ""
	I1124 09:29:20.972847 1707070 logs.go:282] 0 containers: []
	W1124 09:29:20.972855 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:20.972862 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:20.972873 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:21.029330 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:21.029350 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:21.046983 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:21.047000 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:21.112004 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:21.104171   16728 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:21.104918   16728 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:21.106573   16728 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:21.107127   16728 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:21.108653   16728 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:21.104171   16728 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:21.104918   16728 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:21.106573   16728 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:21.107127   16728 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:21.108653   16728 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:21.112015 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:21.112025 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:21.174850 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:21.174870 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:23.702609 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:23.712856 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:23.712939 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:23.741964 1707070 cri.go:89] found id: ""
	I1124 09:29:23.741978 1707070 logs.go:282] 0 containers: []
	W1124 09:29:23.741985 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:23.741991 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:23.742067 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:23.766952 1707070 cri.go:89] found id: ""
	I1124 09:29:23.766966 1707070 logs.go:282] 0 containers: []
	W1124 09:29:23.766972 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:23.766978 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:23.767035 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:23.790992 1707070 cri.go:89] found id: ""
	I1124 09:29:23.791005 1707070 logs.go:282] 0 containers: []
	W1124 09:29:23.791013 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:23.791018 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:23.791073 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:23.819700 1707070 cri.go:89] found id: ""
	I1124 09:29:23.819713 1707070 logs.go:282] 0 containers: []
	W1124 09:29:23.819720 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:23.819726 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:23.819786 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:23.848657 1707070 cri.go:89] found id: ""
	I1124 09:29:23.848683 1707070 logs.go:282] 0 containers: []
	W1124 09:29:23.848690 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:23.848695 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:23.848754 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:23.873546 1707070 cri.go:89] found id: ""
	I1124 09:29:23.873571 1707070 logs.go:282] 0 containers: []
	W1124 09:29:23.873578 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:23.873584 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:23.873654 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:23.899519 1707070 cri.go:89] found id: ""
	I1124 09:29:23.899533 1707070 logs.go:282] 0 containers: []
	W1124 09:29:23.899547 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:23.899556 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:23.899568 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:23.954834 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:23.954854 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:23.971662 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:23.971680 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:24.041660 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:24.033560   16835 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:24.034352   16835 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:24.036062   16835 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:24.036417   16835 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:24.038032   16835 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:24.033560   16835 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:24.034352   16835 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:24.036062   16835 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:24.036417   16835 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:24.038032   16835 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:24.041670 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:24.041681 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:24.105146 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:24.105168 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:26.634760 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:26.646166 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:26.646251 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:26.679257 1707070 cri.go:89] found id: ""
	I1124 09:29:26.679271 1707070 logs.go:282] 0 containers: []
	W1124 09:29:26.679279 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:26.679284 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:26.679344 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:26.710754 1707070 cri.go:89] found id: ""
	I1124 09:29:26.710768 1707070 logs.go:282] 0 containers: []
	W1124 09:29:26.710775 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:26.710782 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:26.710840 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:26.735831 1707070 cri.go:89] found id: ""
	I1124 09:29:26.735845 1707070 logs.go:282] 0 containers: []
	W1124 09:29:26.735852 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:26.735857 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:26.735926 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:26.759918 1707070 cri.go:89] found id: ""
	I1124 09:29:26.759932 1707070 logs.go:282] 0 containers: []
	W1124 09:29:26.759939 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:26.759947 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:26.760002 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:26.783806 1707070 cri.go:89] found id: ""
	I1124 09:29:26.783825 1707070 logs.go:282] 0 containers: []
	W1124 09:29:26.783832 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:26.783838 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:26.783895 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:26.809230 1707070 cri.go:89] found id: ""
	I1124 09:29:26.809244 1707070 logs.go:282] 0 containers: []
	W1124 09:29:26.809252 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:26.809266 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:26.809331 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:26.836902 1707070 cri.go:89] found id: ""
	I1124 09:29:26.836916 1707070 logs.go:282] 0 containers: []
	W1124 09:29:26.836923 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:26.836931 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:26.836942 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:26.853955 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:26.853978 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:26.916186 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:26.907929   16937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:26.908672   16937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:26.910345   16937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:26.910937   16937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:26.912681   16937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1124 09:29:26.916196 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:26.916218 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:26.980050 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:26.980072 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:27.010821 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:27.010838 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:29.573482 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:29.583518 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:29.583582 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:29.608188 1707070 cri.go:89] found id: ""
	I1124 09:29:29.608202 1707070 logs.go:282] 0 containers: []
	W1124 09:29:29.608209 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:29.608214 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:29.608270 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:29.641187 1707070 cri.go:89] found id: ""
	I1124 09:29:29.641201 1707070 logs.go:282] 0 containers: []
	W1124 09:29:29.641209 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:29.641214 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:29.641282 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:29.674249 1707070 cri.go:89] found id: ""
	I1124 09:29:29.674269 1707070 logs.go:282] 0 containers: []
	W1124 09:29:29.674276 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:29.674282 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:29.674339 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:29.700355 1707070 cri.go:89] found id: ""
	I1124 09:29:29.700370 1707070 logs.go:282] 0 containers: []
	W1124 09:29:29.700377 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:29.700382 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:29.700438 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:29.729232 1707070 cri.go:89] found id: ""
	I1124 09:29:29.729246 1707070 logs.go:282] 0 containers: []
	W1124 09:29:29.729253 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:29.729257 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:29.729313 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:29.756753 1707070 cri.go:89] found id: ""
	I1124 09:29:29.756766 1707070 logs.go:282] 0 containers: []
	W1124 09:29:29.756773 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:29.756788 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:29.756849 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:29.782318 1707070 cri.go:89] found id: ""
	I1124 09:29:29.782332 1707070 logs.go:282] 0 containers: []
	W1124 09:29:29.782339 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:29.782347 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:29.782358 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:29.837944 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:29.837963 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:29.855075 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:29.855094 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:29.916212 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:29.907972   17045 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:29.908745   17045 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:29.910447   17045 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:29.910972   17045 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:29.912670   17045 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1124 09:29:29.916221 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:29.916232 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:29.978681 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:29.978703 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:32.530833 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:32.541146 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:32.541251 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:32.566525 1707070 cri.go:89] found id: ""
	I1124 09:29:32.566540 1707070 logs.go:282] 0 containers: []
	W1124 09:29:32.566548 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:32.566554 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:32.566622 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:32.591741 1707070 cri.go:89] found id: ""
	I1124 09:29:32.591756 1707070 logs.go:282] 0 containers: []
	W1124 09:29:32.591763 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:32.591768 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:32.591826 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:32.617127 1707070 cri.go:89] found id: ""
	I1124 09:29:32.617141 1707070 logs.go:282] 0 containers: []
	W1124 09:29:32.617148 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:32.617153 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:32.617209 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:32.654493 1707070 cri.go:89] found id: ""
	I1124 09:29:32.654507 1707070 logs.go:282] 0 containers: []
	W1124 09:29:32.654515 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:32.654521 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:32.654580 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:32.685080 1707070 cri.go:89] found id: ""
	I1124 09:29:32.685094 1707070 logs.go:282] 0 containers: []
	W1124 09:29:32.685101 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:32.685106 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:32.685180 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:32.715751 1707070 cri.go:89] found id: ""
	I1124 09:29:32.715766 1707070 logs.go:282] 0 containers: []
	W1124 09:29:32.715782 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:32.715788 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:32.715850 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:32.742395 1707070 cri.go:89] found id: ""
	I1124 09:29:32.742409 1707070 logs.go:282] 0 containers: []
	W1124 09:29:32.742416 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:32.742424 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:32.742434 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:32.760261 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:32.760278 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:32.828736 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:32.819577   17149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:32.820328   17149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:32.822013   17149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:32.822622   17149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:32.824506   17149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1124 09:29:32.828746 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:32.828759 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:32.896940 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:32.896965 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:32.928695 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:32.928711 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:35.485941 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:35.496873 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:35.496934 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:35.525748 1707070 cri.go:89] found id: ""
	I1124 09:29:35.525782 1707070 logs.go:282] 0 containers: []
	W1124 09:29:35.525791 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:35.525796 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:35.525866 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:35.553111 1707070 cri.go:89] found id: ""
	I1124 09:29:35.553126 1707070 logs.go:282] 0 containers: []
	W1124 09:29:35.553134 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:35.553142 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:35.553220 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:35.578594 1707070 cri.go:89] found id: ""
	I1124 09:29:35.578622 1707070 logs.go:282] 0 containers: []
	W1124 09:29:35.578629 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:35.578635 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:35.578706 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:35.607322 1707070 cri.go:89] found id: ""
	I1124 09:29:35.607336 1707070 logs.go:282] 0 containers: []
	W1124 09:29:35.607343 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:35.607348 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:35.607417 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:35.638865 1707070 cri.go:89] found id: ""
	I1124 09:29:35.638880 1707070 logs.go:282] 0 containers: []
	W1124 09:29:35.638887 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:35.638893 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:35.638960 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:35.672327 1707070 cri.go:89] found id: ""
	I1124 09:29:35.672352 1707070 logs.go:282] 0 containers: []
	W1124 09:29:35.672360 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:35.672365 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:35.672431 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:35.700255 1707070 cri.go:89] found id: ""
	I1124 09:29:35.700269 1707070 logs.go:282] 0 containers: []
	W1124 09:29:35.700277 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:35.700285 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:35.700297 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:35.758017 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:35.758037 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:35.775326 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:35.775344 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:35.842090 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:35.833688   17255 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:35.834400   17255 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:35.836148   17255 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:35.836802   17255 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:35.838521   17255 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1124 09:29:35.842100 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:35.842120 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:35.908742 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:35.908769 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:38.443689 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:38.453968 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:38.454035 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:38.477762 1707070 cri.go:89] found id: ""
	I1124 09:29:38.477776 1707070 logs.go:282] 0 containers: []
	W1124 09:29:38.477783 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:38.477789 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:38.477853 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:38.506120 1707070 cri.go:89] found id: ""
	I1124 09:29:38.506134 1707070 logs.go:282] 0 containers: []
	W1124 09:29:38.506141 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:38.506147 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:38.506203 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:38.530669 1707070 cri.go:89] found id: ""
	I1124 09:29:38.530691 1707070 logs.go:282] 0 containers: []
	W1124 09:29:38.530699 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:38.530705 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:38.530763 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:38.560535 1707070 cri.go:89] found id: ""
	I1124 09:29:38.560558 1707070 logs.go:282] 0 containers: []
	W1124 09:29:38.560565 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:38.560572 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:38.560631 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:38.586535 1707070 cri.go:89] found id: ""
	I1124 09:29:38.586549 1707070 logs.go:282] 0 containers: []
	W1124 09:29:38.586556 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:38.586561 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:38.586620 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:38.611101 1707070 cri.go:89] found id: ""
	I1124 09:29:38.611115 1707070 logs.go:282] 0 containers: []
	W1124 09:29:38.611122 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:38.611127 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:38.611186 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:38.643467 1707070 cri.go:89] found id: ""
	I1124 09:29:38.643482 1707070 logs.go:282] 0 containers: []
	W1124 09:29:38.643489 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:38.643497 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:38.643508 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:38.708197 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:38.708218 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:38.725978 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:38.725995 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:38.789806 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:38.781672   17359 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:38.782397   17359 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:38.783993   17359 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:38.784577   17359 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:38.786135   17359 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1124 09:29:38.789818 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:38.789828 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:38.853085 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:38.853106 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:41.387044 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:41.398117 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:41.398183 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:41.424537 1707070 cri.go:89] found id: ""
	I1124 09:29:41.424551 1707070 logs.go:282] 0 containers: []
	W1124 09:29:41.424558 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:41.424564 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:41.424626 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:41.454716 1707070 cri.go:89] found id: ""
	I1124 09:29:41.454730 1707070 logs.go:282] 0 containers: []
	W1124 09:29:41.454737 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:41.454742 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:41.454801 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:41.479954 1707070 cri.go:89] found id: ""
	I1124 09:29:41.479969 1707070 logs.go:282] 0 containers: []
	W1124 09:29:41.479976 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:41.479981 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:41.480041 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:41.505560 1707070 cri.go:89] found id: ""
	I1124 09:29:41.505575 1707070 logs.go:282] 0 containers: []
	W1124 09:29:41.505582 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:41.505593 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:41.505654 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:41.530996 1707070 cri.go:89] found id: ""
	I1124 09:29:41.531010 1707070 logs.go:282] 0 containers: []
	W1124 09:29:41.531018 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:41.531024 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:41.531090 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:41.557489 1707070 cri.go:89] found id: ""
	I1124 09:29:41.557502 1707070 logs.go:282] 0 containers: []
	W1124 09:29:41.557510 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:41.557516 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:41.557575 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:41.587178 1707070 cri.go:89] found id: ""
	I1124 09:29:41.587192 1707070 logs.go:282] 0 containers: []
	W1124 09:29:41.587199 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:41.587207 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:41.587217 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:41.644853 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:41.644873 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:41.664905 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:41.664924 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:41.731530 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:41.723947   17464 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:41.724430   17464 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:41.726128   17464 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:41.726440   17464 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:41.727892   17464 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:41.723947   17464 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:41.724430   17464 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:41.726128   17464 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:41.726440   17464 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:41.727892   17464 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:41.731540 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:41.731550 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:41.793965 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:41.793985 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:44.323959 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:44.334291 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:44.334352 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:44.364183 1707070 cri.go:89] found id: ""
	I1124 09:29:44.364199 1707070 logs.go:282] 0 containers: []
	W1124 09:29:44.364206 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:44.364212 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:44.364285 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:44.391116 1707070 cri.go:89] found id: ""
	I1124 09:29:44.391130 1707070 logs.go:282] 0 containers: []
	W1124 09:29:44.391137 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:44.391142 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:44.391199 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:44.416448 1707070 cri.go:89] found id: ""
	I1124 09:29:44.416462 1707070 logs.go:282] 0 containers: []
	W1124 09:29:44.416470 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:44.416476 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:44.416533 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:44.442027 1707070 cri.go:89] found id: ""
	I1124 09:29:44.442042 1707070 logs.go:282] 0 containers: []
	W1124 09:29:44.442059 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:44.442065 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:44.442124 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:44.467492 1707070 cri.go:89] found id: ""
	I1124 09:29:44.467516 1707070 logs.go:282] 0 containers: []
	W1124 09:29:44.467525 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:44.467531 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:44.467643 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:44.492900 1707070 cri.go:89] found id: ""
	I1124 09:29:44.492914 1707070 logs.go:282] 0 containers: []
	W1124 09:29:44.492921 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:44.492927 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:44.492986 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:44.518419 1707070 cri.go:89] found id: ""
	I1124 09:29:44.518434 1707070 logs.go:282] 0 containers: []
	W1124 09:29:44.518441 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:44.518449 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:44.518479 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:44.584407 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:44.584427 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:44.616287 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:44.616305 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:44.680013 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:44.680033 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:44.702644 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:44.702662 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:44.770803 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:44.761924   17579 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:44.762682   17579 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:44.764417   17579 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:44.765036   17579 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:44.766673   17579 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:44.761924   17579 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:44.762682   17579 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:44.764417   17579 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:44.765036   17579 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:44.766673   17579 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:47.271699 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:47.283580 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:47.283646 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:47.309341 1707070 cri.go:89] found id: ""
	I1124 09:29:47.309355 1707070 logs.go:282] 0 containers: []
	W1124 09:29:47.309368 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:47.309385 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:47.309443 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:47.335187 1707070 cri.go:89] found id: ""
	I1124 09:29:47.335202 1707070 logs.go:282] 0 containers: []
	W1124 09:29:47.335209 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:47.335214 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:47.335273 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:47.362876 1707070 cri.go:89] found id: ""
	I1124 09:29:47.362891 1707070 logs.go:282] 0 containers: []
	W1124 09:29:47.362898 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:47.362904 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:47.362964 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:47.388290 1707070 cri.go:89] found id: ""
	I1124 09:29:47.388304 1707070 logs.go:282] 0 containers: []
	W1124 09:29:47.388311 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:47.388317 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:47.388374 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:47.416544 1707070 cri.go:89] found id: ""
	I1124 09:29:47.416558 1707070 logs.go:282] 0 containers: []
	W1124 09:29:47.416565 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:47.416570 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:47.416629 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:47.441861 1707070 cri.go:89] found id: ""
	I1124 09:29:47.441875 1707070 logs.go:282] 0 containers: []
	W1124 09:29:47.441902 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:47.441909 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:47.441978 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:47.465857 1707070 cri.go:89] found id: ""
	I1124 09:29:47.465879 1707070 logs.go:282] 0 containers: []
	W1124 09:29:47.465886 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:47.465894 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:47.465905 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:47.523429 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:47.523450 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:47.540445 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:47.540462 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:47.607683 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:47.599524   17667 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:47.600165   17667 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:47.601865   17667 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:47.602402   17667 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:47.603965   17667 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:47.599524   17667 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:47.600165   17667 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:47.601865   17667 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:47.602402   17667 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:47.603965   17667 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:47.607694 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:47.607704 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:47.682000 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:47.682023 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:50.218599 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:50.229182 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:50.229254 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:50.254129 1707070 cri.go:89] found id: ""
	I1124 09:29:50.254143 1707070 logs.go:282] 0 containers: []
	W1124 09:29:50.254150 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:50.254155 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:50.254219 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:50.280233 1707070 cri.go:89] found id: ""
	I1124 09:29:50.280247 1707070 logs.go:282] 0 containers: []
	W1124 09:29:50.280254 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:50.280260 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:50.280317 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:50.304403 1707070 cri.go:89] found id: ""
	I1124 09:29:50.304417 1707070 logs.go:282] 0 containers: []
	W1124 09:29:50.304424 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:50.304430 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:50.304492 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:50.329881 1707070 cri.go:89] found id: ""
	I1124 09:29:50.329897 1707070 logs.go:282] 0 containers: []
	W1124 09:29:50.329904 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:50.329910 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:50.329987 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:50.358124 1707070 cri.go:89] found id: ""
	I1124 09:29:50.358139 1707070 logs.go:282] 0 containers: []
	W1124 09:29:50.358149 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:50.358158 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:50.358246 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:50.384151 1707070 cri.go:89] found id: ""
	I1124 09:29:50.384165 1707070 logs.go:282] 0 containers: []
	W1124 09:29:50.384178 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:50.384196 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:50.384254 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:50.408884 1707070 cri.go:89] found id: ""
	I1124 09:29:50.408899 1707070 logs.go:282] 0 containers: []
	W1124 09:29:50.408906 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:50.408914 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:50.408925 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:50.464122 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:50.464147 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:50.480720 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:50.480736 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:50.544337 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:50.536334   17772 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:50.536956   17772 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:50.538555   17772 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:50.539042   17772 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:50.540634   17772 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:50.536334   17772 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:50.536956   17772 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:50.538555   17772 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:50.539042   17772 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:50.540634   17772 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:50.544348 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:50.544361 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:50.606972 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:50.606993 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:53.143446 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:53.154359 1707070 kubeadm.go:602] duration metric: took 4m4.065975367s to restartPrimaryControlPlane
	W1124 09:29:53.154423 1707070 out.go:285] ! Unable to restart control-plane node(s), will reset cluster: <no value>
	I1124 09:29:53.154529 1707070 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm reset --cri-socket /run/containerd/containerd.sock --force"
	I1124 09:29:53.563147 1707070 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1124 09:29:53.576942 1707070 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1124 09:29:53.584698 1707070 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1124 09:29:53.584758 1707070 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1124 09:29:53.592605 1707070 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1124 09:29:53.592613 1707070 kubeadm.go:158] found existing configuration files:
	
	I1124 09:29:53.592678 1707070 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1124 09:29:53.600460 1707070 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1124 09:29:53.600517 1707070 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1124 09:29:53.607615 1707070 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1124 09:29:53.615236 1707070 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1124 09:29:53.615293 1707070 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1124 09:29:53.622532 1707070 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1124 09:29:53.630501 1707070 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1124 09:29:53.630562 1707070 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1124 09:29:53.638386 1707070 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1124 09:29:53.646257 1707070 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1124 09:29:53.646321 1707070 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1124 09:29:53.653836 1707070 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1124 09:29:53.692708 1707070 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-beta.0
	I1124 09:29:53.692756 1707070 kubeadm.go:319] [preflight] Running pre-flight checks
	I1124 09:29:53.765347 1707070 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1124 09:29:53.765413 1707070 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1124 09:29:53.765447 1707070 kubeadm.go:319] OS: Linux
	I1124 09:29:53.765490 1707070 kubeadm.go:319] CGROUPS_CPU: enabled
	I1124 09:29:53.765537 1707070 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1124 09:29:53.765589 1707070 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1124 09:29:53.765636 1707070 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1124 09:29:53.765682 1707070 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1124 09:29:53.765729 1707070 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1124 09:29:53.765772 1707070 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1124 09:29:53.765819 1707070 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1124 09:29:53.765864 1707070 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1124 09:29:53.828877 1707070 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1124 09:29:53.829001 1707070 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1124 09:29:53.829104 1707070 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1124 09:29:53.834791 1707070 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1124 09:29:53.838245 1707070 out.go:252]   - Generating certificates and keys ...
	I1124 09:29:53.838369 1707070 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1124 09:29:53.838434 1707070 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1124 09:29:53.838527 1707070 kubeadm.go:319] [certs] Using existing apiserver-kubelet-client certificate and key on disk
	I1124 09:29:53.838616 1707070 kubeadm.go:319] [certs] Using existing front-proxy-ca certificate authority
	I1124 09:29:53.838701 1707070 kubeadm.go:319] [certs] Using existing front-proxy-client certificate and key on disk
	I1124 09:29:53.838784 1707070 kubeadm.go:319] [certs] Using existing etcd/ca certificate authority
	I1124 09:29:53.838854 1707070 kubeadm.go:319] [certs] Using existing etcd/server certificate and key on disk
	I1124 09:29:53.838919 1707070 kubeadm.go:319] [certs] Using existing etcd/peer certificate and key on disk
	I1124 09:29:53.839002 1707070 kubeadm.go:319] [certs] Using existing etcd/healthcheck-client certificate and key on disk
	I1124 09:29:53.839386 1707070 kubeadm.go:319] [certs] Using existing apiserver-etcd-client certificate and key on disk
	I1124 09:29:53.839639 1707070 kubeadm.go:319] [certs] Using the existing "sa" key
	I1124 09:29:53.839706 1707070 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1124 09:29:54.545063 1707070 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1124 09:29:55.036514 1707070 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1124 09:29:55.148786 1707070 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1124 09:29:55.311399 1707070 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1124 09:29:55.656188 1707070 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1124 09:29:55.656996 1707070 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1124 09:29:55.659590 1707070 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1124 09:29:55.662658 1707070 out.go:252]   - Booting up control plane ...
	I1124 09:29:55.662786 1707070 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1124 09:29:55.662870 1707070 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1124 09:29:55.664747 1707070 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1124 09:29:55.686536 1707070 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1124 09:29:55.686657 1707070 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1124 09:29:55.694440 1707070 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1124 09:29:55.694885 1707070 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1124 09:29:55.694934 1707070 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1124 09:29:55.830944 1707070 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1124 09:29:55.831051 1707070 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1124 09:33:55.829210 1707070 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.000251849s
	I1124 09:33:55.829235 1707070 kubeadm.go:319] 
	I1124 09:33:55.829291 1707070 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1124 09:33:55.829323 1707070 kubeadm.go:319] 	- The kubelet is not running
	I1124 09:33:55.829428 1707070 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1124 09:33:55.829432 1707070 kubeadm.go:319] 
	I1124 09:33:55.829536 1707070 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1124 09:33:55.829573 1707070 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1124 09:33:55.829603 1707070 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1124 09:33:55.829606 1707070 kubeadm.go:319] 
	I1124 09:33:55.833661 1707070 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1124 09:33:55.834099 1707070 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1124 09:33:55.834220 1707070 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1124 09:33:55.834508 1707070 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	I1124 09:33:55.834517 1707070 kubeadm.go:319] 
	I1124 09:33:55.834670 1707070 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	W1124 09:33:55.834735 1707070 out.go:285] ! initialization failed, will try again: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000251849s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	I1124 09:33:55.834825 1707070 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm reset --cri-socket /run/containerd/containerd.sock --force"
	I1124 09:33:56.243415 1707070 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1124 09:33:56.256462 1707070 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1124 09:33:56.256517 1707070 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1124 09:33:56.264387 1707070 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1124 09:33:56.264397 1707070 kubeadm.go:158] found existing configuration files:
	
	I1124 09:33:56.264448 1707070 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1124 09:33:56.272152 1707070 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1124 09:33:56.272210 1707070 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1124 09:33:56.279938 1707070 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1124 09:33:56.287667 1707070 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1124 09:33:56.287720 1707070 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1124 09:33:56.295096 1707070 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1124 09:33:56.302699 1707070 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1124 09:33:56.302758 1707070 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1124 09:33:56.310421 1707070 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1124 09:33:56.318128 1707070 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1124 09:33:56.318183 1707070 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1124 09:33:56.325438 1707070 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1124 09:33:56.364513 1707070 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-beta.0
	I1124 09:33:56.364563 1707070 kubeadm.go:319] [preflight] Running pre-flight checks
	I1124 09:33:56.440273 1707070 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1124 09:33:56.440340 1707070 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1124 09:33:56.440376 1707070 kubeadm.go:319] OS: Linux
	I1124 09:33:56.440420 1707070 kubeadm.go:319] CGROUPS_CPU: enabled
	I1124 09:33:56.440467 1707070 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1124 09:33:56.440513 1707070 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1124 09:33:56.440560 1707070 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1124 09:33:56.440606 1707070 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1124 09:33:56.440654 1707070 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1124 09:33:56.440697 1707070 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1124 09:33:56.440749 1707070 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1124 09:33:56.440794 1707070 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1124 09:33:56.504487 1707070 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1124 09:33:56.504590 1707070 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1124 09:33:56.504685 1707070 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1124 09:33:56.510220 1707070 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1124 09:33:56.513847 1707070 out.go:252]   - Generating certificates and keys ...
	I1124 09:33:56.513936 1707070 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1124 09:33:56.514003 1707070 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1124 09:33:56.514078 1707070 kubeadm.go:319] [certs] Using existing apiserver-kubelet-client certificate and key on disk
	I1124 09:33:56.514137 1707070 kubeadm.go:319] [certs] Using existing front-proxy-ca certificate authority
	I1124 09:33:56.514205 1707070 kubeadm.go:319] [certs] Using existing front-proxy-client certificate and key on disk
	I1124 09:33:56.514264 1707070 kubeadm.go:319] [certs] Using existing etcd/ca certificate authority
	I1124 09:33:56.514326 1707070 kubeadm.go:319] [certs] Using existing etcd/server certificate and key on disk
	I1124 09:33:56.514386 1707070 kubeadm.go:319] [certs] Using existing etcd/peer certificate and key on disk
	I1124 09:33:56.514481 1707070 kubeadm.go:319] [certs] Using existing etcd/healthcheck-client certificate and key on disk
	I1124 09:33:56.514553 1707070 kubeadm.go:319] [certs] Using existing apiserver-etcd-client certificate and key on disk
	I1124 09:33:56.514589 1707070 kubeadm.go:319] [certs] Using the existing "sa" key
	I1124 09:33:56.514644 1707070 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1124 09:33:57.046366 1707070 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1124 09:33:57.432965 1707070 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1124 09:33:57.802873 1707070 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1124 09:33:58.414576 1707070 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1124 09:33:58.520825 1707070 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1124 09:33:58.522049 1707070 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1124 09:33:58.526436 1707070 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1124 09:33:58.529676 1707070 out.go:252]   - Booting up control plane ...
	I1124 09:33:58.529779 1707070 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1124 09:33:58.529855 1707070 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1124 09:33:58.529921 1707070 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1124 09:33:58.549683 1707070 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1124 09:33:58.549801 1707070 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1124 09:33:58.557327 1707070 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1124 09:33:58.557589 1707070 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1124 09:33:58.557812 1707070 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1124 09:33:58.696439 1707070 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1124 09:33:58.696553 1707070 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1124 09:37:58.697446 1707070 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.001230859s
	I1124 09:37:58.697472 1707070 kubeadm.go:319] 
	I1124 09:37:58.697558 1707070 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1124 09:37:58.697602 1707070 kubeadm.go:319] 	- The kubelet is not running
	I1124 09:37:58.697730 1707070 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1124 09:37:58.697737 1707070 kubeadm.go:319] 
	I1124 09:37:58.697847 1707070 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1124 09:37:58.697878 1707070 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1124 09:37:58.697921 1707070 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1124 09:37:58.697925 1707070 kubeadm.go:319] 
	I1124 09:37:58.701577 1707070 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1124 09:37:58.701990 1707070 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1124 09:37:58.702104 1707070 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1124 09:37:58.702344 1707070 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	I1124 09:37:58.702350 1707070 kubeadm.go:319] 
	I1124 09:37:58.702417 1707070 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	I1124 09:37:58.702481 1707070 kubeadm.go:403] duration metric: took 12m9.652556415s to StartCluster
	I1124 09:37:58.702514 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:37:58.702578 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:37:58.726968 1707070 cri.go:89] found id: ""
	I1124 09:37:58.726981 1707070 logs.go:282] 0 containers: []
	W1124 09:37:58.726988 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:37:58.726994 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:37:58.727055 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:37:58.756184 1707070 cri.go:89] found id: ""
	I1124 09:37:58.756198 1707070 logs.go:282] 0 containers: []
	W1124 09:37:58.756205 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:37:58.756210 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:37:58.756266 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:37:58.781056 1707070 cri.go:89] found id: ""
	I1124 09:37:58.781070 1707070 logs.go:282] 0 containers: []
	W1124 09:37:58.781077 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:37:58.781082 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:37:58.781145 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:37:58.805769 1707070 cri.go:89] found id: ""
	I1124 09:37:58.805783 1707070 logs.go:282] 0 containers: []
	W1124 09:37:58.805790 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:37:58.805796 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:37:58.805854 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:37:58.830758 1707070 cri.go:89] found id: ""
	I1124 09:37:58.830780 1707070 logs.go:282] 0 containers: []
	W1124 09:37:58.830791 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:37:58.830797 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:37:58.830857 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:37:58.855967 1707070 cri.go:89] found id: ""
	I1124 09:37:58.855981 1707070 logs.go:282] 0 containers: []
	W1124 09:37:58.855988 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:37:58.855994 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:37:58.856051 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:37:58.890842 1707070 cri.go:89] found id: ""
	I1124 09:37:58.890857 1707070 logs.go:282] 0 containers: []
	W1124 09:37:58.890865 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:37:58.890873 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:37:58.890885 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:37:58.910142 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:37:58.910157 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:37:58.985463 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:37:58.976283   21584 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:37:58.977104   21584 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:37:58.978904   21584 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:37:58.979496   21584 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:37:58.981268   21584 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:37:58.976283   21584 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:37:58.977104   21584 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:37:58.978904   21584 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:37:58.979496   21584 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:37:58.981268   21584 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:37:58.985474 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:37:58.985486 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:37:59.051823 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:37:59.051845 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:37:59.080123 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:37:59.080139 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	W1124 09:37:59.137954 1707070 out.go:434] Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001230859s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	W1124 09:37:59.138000 1707070 out.go:285] * 
	W1124 09:37:59.138117 1707070 out.go:285] X Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001230859s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	W1124 09:37:59.138177 1707070 out.go:285] * 
	W1124 09:37:59.140306 1707070 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1124 09:37:59.145839 1707070 out.go:203] 
	W1124 09:37:59.149636 1707070 out.go:285] X Exiting due to K8S_KUBELET_NOT_RUNNING: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001230859s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	W1124 09:37:59.149678 1707070 out.go:285] * Suggestion: Check output of 'journalctl -xeu kubelet', try passing --extra-config=kubelet.cgroup-driver=systemd to minikube start
	W1124 09:37:59.149707 1707070 out.go:285] * Related issue: https://github.com/kubernetes/minikube/issues/4172
	I1124 09:37:59.153358 1707070 out.go:203] 
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> containerd <==
	Nov 24 09:38:08 functional-291288 containerd[10324]: time="2025-11-24T09:38:08.742098573Z" level=info msg="ImageUpdate event name:\"docker.io/kicbase/echo-server:functional-291288\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Nov 24 09:38:09 functional-291288 containerd[10324]: time="2025-11-24T09:38:09.725115511Z" level=info msg="RemoveImage \"kicbase/echo-server:functional-291288\""
	Nov 24 09:38:09 functional-291288 containerd[10324]: time="2025-11-24T09:38:09.727769003Z" level=info msg="ImageDelete event name:\"docker.io/kicbase/echo-server:functional-291288\""
	Nov 24 09:38:09 functional-291288 containerd[10324]: time="2025-11-24T09:38:09.729992993Z" level=info msg="ImageDelete event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\""
	Nov 24 09:38:09 functional-291288 containerd[10324]: time="2025-11-24T09:38:09.740634129Z" level=info msg="RemoveImage \"kicbase/echo-server:functional-291288\" returns successfully"
	Nov 24 09:38:10 functional-291288 containerd[10324]: time="2025-11-24T09:38:10.017725287Z" level=info msg="No images store for sha256:af1a838d2702e4e84137a83a66ae93ebb59c7bf115bf022cc84ce1a55dfd3fb4"
	Nov 24 09:38:10 functional-291288 containerd[10324]: time="2025-11-24T09:38:10.020247594Z" level=info msg="ImageCreate event name:\"docker.io/kicbase/echo-server:functional-291288\""
	Nov 24 09:38:10 functional-291288 containerd[10324]: time="2025-11-24T09:38:10.028698216Z" level=info msg="ImageCreate event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Nov 24 09:38:10 functional-291288 containerd[10324]: time="2025-11-24T09:38:10.029232770Z" level=info msg="ImageUpdate event name:\"docker.io/kicbase/echo-server:functional-291288\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Nov 24 09:38:11 functional-291288 containerd[10324]: time="2025-11-24T09:38:11.459119625Z" level=info msg="RemoveImage \"kicbase/echo-server:functional-291288\""
	Nov 24 09:38:11 functional-291288 containerd[10324]: time="2025-11-24T09:38:11.462708306Z" level=info msg="ImageDelete event name:\"docker.io/kicbase/echo-server:functional-291288\""
	Nov 24 09:38:11 functional-291288 containerd[10324]: time="2025-11-24T09:38:11.465197440Z" level=info msg="ImageDelete event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\""
	Nov 24 09:38:11 functional-291288 containerd[10324]: time="2025-11-24T09:38:11.482046877Z" level=info msg="RemoveImage \"kicbase/echo-server:functional-291288\" returns successfully"
	Nov 24 09:38:11 functional-291288 containerd[10324]: time="2025-11-24T09:38:11.784805820Z" level=info msg="No images store for sha256:af1a838d2702e4e84137a83a66ae93ebb59c7bf115bf022cc84ce1a55dfd3fb4"
	Nov 24 09:38:11 functional-291288 containerd[10324]: time="2025-11-24T09:38:11.787158164Z" level=info msg="ImageCreate event name:\"docker.io/kicbase/echo-server:functional-291288\""
	Nov 24 09:38:11 functional-291288 containerd[10324]: time="2025-11-24T09:38:11.795127091Z" level=info msg="ImageCreate event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Nov 24 09:38:11 functional-291288 containerd[10324]: time="2025-11-24T09:38:11.795603535Z" level=info msg="ImageUpdate event name:\"docker.io/kicbase/echo-server:functional-291288\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Nov 24 09:38:12 functional-291288 containerd[10324]: time="2025-11-24T09:38:12.816798467Z" level=info msg="RemoveImage \"kicbase/echo-server:functional-291288\""
	Nov 24 09:38:12 functional-291288 containerd[10324]: time="2025-11-24T09:38:12.819240765Z" level=info msg="ImageDelete event name:\"docker.io/kicbase/echo-server:functional-291288\""
	Nov 24 09:38:12 functional-291288 containerd[10324]: time="2025-11-24T09:38:12.822304091Z" level=info msg="ImageDelete event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\""
	Nov 24 09:38:12 functional-291288 containerd[10324]: time="2025-11-24T09:38:12.835765777Z" level=info msg="RemoveImage \"kicbase/echo-server:functional-291288\" returns successfully"
	Nov 24 09:38:13 functional-291288 containerd[10324]: time="2025-11-24T09:38:13.649607557Z" level=info msg="No images store for sha256:80154cc39374c5be6259fccbd4295ce399d3a1d7b6e10b99200044587775c910"
	Nov 24 09:38:13 functional-291288 containerd[10324]: time="2025-11-24T09:38:13.651890157Z" level=info msg="ImageCreate event name:\"docker.io/kicbase/echo-server:functional-291288\""
	Nov 24 09:38:13 functional-291288 containerd[10324]: time="2025-11-24T09:38:13.659732716Z" level=info msg="ImageCreate event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Nov 24 09:38:13 functional-291288 containerd[10324]: time="2025-11-24T09:38:13.660101507Z" level=info msg="ImageUpdate event name:\"docker.io/kicbase/echo-server:functional-291288\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:39:55.456118   24221 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:39:55.456545   24221 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:39:55.458210   24221 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:39:55.458632   24221 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:39:55.460205   24221 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[Nov24 08:20] overlayfs: idmapped layers are currently not supported
	[Nov24 08:21] overlayfs: idmapped layers are currently not supported
	[Nov24 08:22] overlayfs: idmapped layers are currently not supported
	[Nov24 08:23] overlayfs: idmapped layers are currently not supported
	[Nov24 08:24] overlayfs: idmapped layers are currently not supported
	[ +19.566509] overlayfs: idmapped layers are currently not supported
	[Nov24 08:25] overlayfs: idmapped layers are currently not supported
	[ +35.934095] overlayfs: idmapped layers are currently not supported
	[Nov24 08:27] overlayfs: idmapped layers are currently not supported
	[Nov24 08:28] overlayfs: idmapped layers are currently not supported
	[Nov24 08:29] overlayfs: idmapped layers are currently not supported
	[Nov24 08:30] overlayfs: idmapped layers are currently not supported
	[Nov24 08:31] overlayfs: idmapped layers are currently not supported
	[ +44.505180] overlayfs: idmapped layers are currently not supported
	[Nov24 08:32] overlayfs: idmapped layers are currently not supported
	[Nov24 08:33] overlayfs: idmapped layers are currently not supported
	[Nov24 08:34] overlayfs: idmapped layers are currently not supported
	[ +21.137877] overlayfs: idmapped layers are currently not supported
	[ +28.724175] overlayfs: idmapped layers are currently not supported
	[Nov24 08:36] overlayfs: idmapped layers are currently not supported
	[Nov24 08:37] overlayfs: idmapped layers are currently not supported
	[Nov24 08:38] overlayfs: idmapped layers are currently not supported
	[  +4.063834] overlayfs: idmapped layers are currently not supported
	[Nov24 08:40] overlayfs: idmapped layers are currently not supported
	[Nov24 08:42] kauditd_printk_skb: 8 callbacks suppressed
	
	
	==> kernel <==
	 09:39:55 up  8:22,  0 user,  load average: 1.01, 0.45, 0.40
	Linux functional-291288 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Nov 24 09:39:52 functional-291288 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Nov 24 09:39:52 functional-291288 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 472.
	Nov 24 09:39:52 functional-291288 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:39:52 functional-291288 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:39:52 functional-291288 kubelet[24046]: E1124 09:39:52.939837   24046 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Nov 24 09:39:52 functional-291288 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Nov 24 09:39:52 functional-291288 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Nov 24 09:39:53 functional-291288 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 473.
	Nov 24 09:39:53 functional-291288 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:39:53 functional-291288 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:39:53 functional-291288 kubelet[24080]: E1124 09:39:53.617444   24080 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Nov 24 09:39:53 functional-291288 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Nov 24 09:39:53 functional-291288 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Nov 24 09:39:54 functional-291288 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 474.
	Nov 24 09:39:54 functional-291288 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:39:54 functional-291288 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:39:54 functional-291288 kubelet[24118]: E1124 09:39:54.448151   24118 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Nov 24 09:39:54 functional-291288 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Nov 24 09:39:54 functional-291288 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Nov 24 09:39:55 functional-291288 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 475.
	Nov 24 09:39:55 functional-291288 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:39:55 functional-291288 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:39:55 functional-291288 kubelet[24145]: E1124 09:39:55.208216   24145 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Nov 24 09:39:55 functional-291288 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Nov 24 09:39:55 functional-291288 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	
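Editor's note: the repeated kubelet restarts above (counters 472-475) all fail the same validation — kubelet v1.35 refuses to start on a cgroup v1 host unless that check is explicitly relaxed, exactly as the `[WARNING SystemVerification]` line in the kubeadm output states. A minimal sketch of the override that warning names (the `failCgroupV1` KubeletConfiguration field; verify the exact spelling against the kubelet version in use) would be:

```yaml
# Hypothetical KubeletConfiguration fragment, not part of this report's
# run. failCgroupV1 is the option referenced by the SystemVerification
# warning; false lets kubelet v1.35+ start on a cgroup v1 host.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
failCgroupV1: false
```

The alternatives the log itself suggests are migrating the CI host to cgroup v2, or the `--extra-config=kubelet.cgroup-driver=systemd` flag from the K8S_KUBELET_NOT_RUNNING suggestion, though the latter addresses the driver rather than the cgroup v1 validation shown here.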

-- /stdout --
helpers_test.go:262: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-291288 -n functional-291288
helpers_test.go:262: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-291288 -n functional-291288: exit status 2 (359.0125ms)

-- stdout --
	Stopped

-- /stdout --
helpers_test.go:262: status error: exit status 2 (may be ok)
helpers_test.go:264: "functional-291288" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/StatusCmd (3.18s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmdConnect (2.44s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmdConnect
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmdConnect

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmdConnect
functional_test.go:1636: (dbg) Run:  kubectl --context functional-291288 create deployment hello-node-connect --image kicbase/echo-server
functional_test.go:1636: (dbg) Non-zero exit: kubectl --context functional-291288 create deployment hello-node-connect --image kicbase/echo-server: exit status 1 (54.528086ms)

** stderr ** 
	error: failed to create deployment: Post "https://192.168.49.2:8441/apis/apps/v1/namespaces/default/deployments?fieldManager=kubectl-create&fieldValidation=Strict": dial tcp 192.168.49.2:8441: connect: connection refused

** /stderr **
functional_test.go:1638: failed to create hello-node deployment with this command "kubectl --context functional-291288 create deployment hello-node-connect --image kicbase/echo-server": exit status 1.
functional_test.go:1608: service test failed - dumping debug information
functional_test.go:1609: -----------------------service failure post-mortem--------------------------------
functional_test.go:1612: (dbg) Run:  kubectl --context functional-291288 describe po hello-node-connect
functional_test.go:1612: (dbg) Non-zero exit: kubectl --context functional-291288 describe po hello-node-connect: exit status 1 (65.640816ms)

** stderr ** 
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?

** /stderr **
functional_test.go:1614: "kubectl --context functional-291288 describe po hello-node-connect" failed: exit status 1
functional_test.go:1616: hello-node pod describe:
functional_test.go:1618: (dbg) Run:  kubectl --context functional-291288 logs -l app=hello-node-connect
functional_test.go:1618: (dbg) Non-zero exit: kubectl --context functional-291288 logs -l app=hello-node-connect: exit status 1 (55.236188ms)

** stderr ** 
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?

** /stderr **
functional_test.go:1620: "kubectl --context functional-291288 logs -l app=hello-node-connect" failed: exit status 1
functional_test.go:1622: hello-node logs:
functional_test.go:1624: (dbg) Run:  kubectl --context functional-291288 describe svc hello-node-connect
functional_test.go:1624: (dbg) Non-zero exit: kubectl --context functional-291288 describe svc hello-node-connect: exit status 1 (67.603143ms)

** stderr ** 
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?

** /stderr **
functional_test.go:1626: "kubectl --context functional-291288 describe svc hello-node-connect" failed: exit status 1
functional_test.go:1628: hello-node svc describe:
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:223: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmdConnect]: network settings <======
helpers_test.go:230: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:238: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmdConnect]: docker inspect <======
helpers_test.go:239: (dbg) Run:  docker inspect functional-291288
helpers_test.go:243: (dbg) docker inspect functional-291288:

-- stdout --
	[
	    {
	        "Id": "70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52",
	        "Created": "2025-11-24T09:10:51.896020191Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 1695240,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-11-24T09:10:51.968983407Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:572c983e466f1f784136812eef5cc59ac623db764bc7704d3676c4643993fd08",
	        "ResolvConfPath": "/var/lib/docker/containers/70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52/hostname",
	        "HostsPath": "/var/lib/docker/containers/70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52/hosts",
	        "LogPath": "/var/lib/docker/containers/70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52/70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52-json.log",
	        "Name": "/functional-291288",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "functional-291288:/var",
	                "/lib/modules:/lib/modules:ro"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-291288",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52",
	                "LowerDir": "/var/lib/docker/overlay2/3cc6f5f8c809d502c515552ad283ef0dee330beb830a15376b0447c77fbc81b7-init/diff:/var/lib/docker/overlay2/bf294786c7ade59cb761c7bdcc27045362c3779b00a569b685443f34ade17737/diff",
	                "MergedDir": "/var/lib/docker/overlay2/3cc6f5f8c809d502c515552ad283ef0dee330beb830a15376b0447c77fbc81b7/merged",
	                "UpperDir": "/var/lib/docker/overlay2/3cc6f5f8c809d502c515552ad283ef0dee330beb830a15376b0447c77fbc81b7/diff",
	                "WorkDir": "/var/lib/docker/overlay2/3cc6f5f8c809d502c515552ad283ef0dee330beb830a15376b0447c77fbc81b7/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "volume",
	                "Name": "functional-291288",
	                "Source": "/var/lib/docker/volumes/functional-291288/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            },
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-291288",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-291288",
	                "name.minikube.sigs.k8s.io": "functional-291288",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "09c1c2eef0dca6362dde63b4cbc372c0cfa3e4fd084b8745043d8b88925691bf",
	            "SandboxKey": "/var/run/docker/netns/09c1c2eef0dc",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34684"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34685"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34688"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34686"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34687"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-291288": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "7e:49:22:0b:f9:2c",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "1e8f91e8ad9f46b831bbb1b0589b0022d940ee9875e64a648dc80612f3ca93dc",
	                    "EndpointID": "5de5ca8ccb07584b21e6e4e30dba12e0233e8d28c3e48e705cddffe75263b337",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-291288",
	                        "70848be15fcc"
	                    ]
	                }
	            }
	        }
	    }
	]

-- /stdout --
helpers_test.go:247: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-291288 -n functional-291288
helpers_test.go:247: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-291288 -n functional-291288: exit status 2 (315.315014ms)

-- stdout --
	Running

-- /stdout --
helpers_test.go:247: status error: exit status 2 (may be ok)
helpers_test.go:252: <<< TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmdConnect FAILED: start of post-mortem logs <<<
helpers_test.go:253: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmdConnect]: minikube logs <======
helpers_test.go:255: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 logs -n 25
helpers_test.go:260: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmdConnect logs: 
-- stdout --
	
	==> Audit <==
	┌─────────┬─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│ COMMAND │                                                                              ARGS                                                                               │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├─────────┼─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ image   │ functional-291288 image ls                                                                                                                                      │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:38 UTC │ 24 Nov 25 09:38 UTC │
	│ image   │ functional-291288 image load --daemon kicbase/echo-server:functional-291288 --alsologtostderr                                                                   │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:38 UTC │ 24 Nov 25 09:38 UTC │
	│ image   │ functional-291288 image ls                                                                                                                                      │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:38 UTC │ 24 Nov 25 09:38 UTC │
	│ image   │ functional-291288 image load --daemon kicbase/echo-server:functional-291288 --alsologtostderr                                                                   │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:38 UTC │ 24 Nov 25 09:38 UTC │
	│ ssh     │ functional-291288 ssh sudo cat /etc/ssl/certs/1654467.pem                                                                                                       │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:38 UTC │ 24 Nov 25 09:38 UTC │
	│ ssh     │ functional-291288 ssh sudo cat /usr/share/ca-certificates/1654467.pem                                                                                           │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:38 UTC │ 24 Nov 25 09:38 UTC │
	│ image   │ functional-291288 image ls                                                                                                                                      │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:38 UTC │ 24 Nov 25 09:38 UTC │
	│ ssh     │ functional-291288 ssh sudo cat /etc/ssl/certs/51391683.0                                                                                                        │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:38 UTC │ 24 Nov 25 09:38 UTC │
	│ image   │ functional-291288 image save kicbase/echo-server:functional-291288 /home/jenkins/workspace/Docker_Linux_containerd_arm64/echo-server-save.tar --alsologtostderr │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:38 UTC │ 24 Nov 25 09:38 UTC │
	│ ssh     │ functional-291288 ssh sudo cat /etc/ssl/certs/16544672.pem                                                                                                      │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:38 UTC │ 24 Nov 25 09:38 UTC │
	│ image   │ functional-291288 image rm kicbase/echo-server:functional-291288 --alsologtostderr                                                                              │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:38 UTC │ 24 Nov 25 09:38 UTC │
	│ ssh     │ functional-291288 ssh sudo cat /usr/share/ca-certificates/16544672.pem                                                                                          │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:38 UTC │ 24 Nov 25 09:38 UTC │
	│ image   │ functional-291288 image ls                                                                                                                                      │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:38 UTC │ 24 Nov 25 09:38 UTC │
	│ ssh     │ functional-291288 ssh sudo cat /etc/ssl/certs/3ec20f2e.0                                                                                                        │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:38 UTC │ 24 Nov 25 09:38 UTC │
	│ image   │ functional-291288 image load /home/jenkins/workspace/Docker_Linux_containerd_arm64/echo-server-save.tar --alsologtostderr                                       │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:38 UTC │ 24 Nov 25 09:38 UTC │
	│ ssh     │ functional-291288 ssh sudo cat /etc/test/nested/copy/1654467/hosts                                                                                              │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:38 UTC │ 24 Nov 25 09:38 UTC │
	│ image   │ functional-291288 image ls                                                                                                                                      │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:38 UTC │ 24 Nov 25 09:38 UTC │
	│ ssh     │ functional-291288 ssh echo hello                                                                                                                                │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:38 UTC │ 24 Nov 25 09:38 UTC │
	│ image   │ functional-291288 image save --daemon kicbase/echo-server:functional-291288 --alsologtostderr                                                                   │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:38 UTC │ 24 Nov 25 09:38 UTC │
	│ ssh     │ functional-291288 ssh cat /etc/hostname                                                                                                                         │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:38 UTC │ 24 Nov 25 09:38 UTC │
	│ tunnel  │ functional-291288 tunnel --alsologtostderr                                                                                                                      │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:38 UTC │                     │
	│ tunnel  │ functional-291288 tunnel --alsologtostderr                                                                                                                      │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:38 UTC │                     │
	│ tunnel  │ functional-291288 tunnel --alsologtostderr                                                                                                                      │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:38 UTC │                     │
	│ addons  │ functional-291288 addons list                                                                                                                                   │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:39 UTC │ 24 Nov 25 09:39 UTC │
	│ addons  │ functional-291288 addons list -o json                                                                                                                           │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:39 UTC │ 24 Nov 25 09:39 UTC │
	└─────────┴─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/11/24 09:25:43
	Running on machine: ip-172-31-30-239
	Binary: Built with gc go1.25.3 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1124 09:25:43.956868 1707070 out.go:360] Setting OutFile to fd 1 ...
	I1124 09:25:43.957002 1707070 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1124 09:25:43.957006 1707070 out.go:374] Setting ErrFile to fd 2...
	I1124 09:25:43.957010 1707070 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1124 09:25:43.957247 1707070 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21978-1652607/.minikube/bin
	I1124 09:25:43.957575 1707070 out.go:368] Setting JSON to false
	I1124 09:25:43.958421 1707070 start.go:133] hostinfo: {"hostname":"ip-172-31-30-239","uptime":29273,"bootTime":1763947071,"procs":157,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"92f46a7d-c249-4c12-924a-77f64874c910"}
	I1124 09:25:43.958501 1707070 start.go:143] virtualization:  
	I1124 09:25:43.961954 1707070 out.go:179] * [functional-291288] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1124 09:25:43.965745 1707070 out.go:179]   - MINIKUBE_LOCATION=21978
	I1124 09:25:43.965806 1707070 notify.go:221] Checking for updates...
	I1124 09:25:43.971831 1707070 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1124 09:25:43.974596 1707070 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/21978-1652607/kubeconfig
	I1124 09:25:43.977531 1707070 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/21978-1652607/.minikube
	I1124 09:25:43.980447 1707070 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1124 09:25:43.983266 1707070 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1124 09:25:43.986897 1707070 config.go:182] Loaded profile config "functional-291288": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1124 09:25:43.986999 1707070 driver.go:422] Setting default libvirt URI to qemu:///system
	I1124 09:25:44.009686 1707070 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1124 09:25:44.009789 1707070 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1124 09:25:44.075505 1707070 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:55 SystemTime:2025-11-24 09:25:44.065719192 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:a
arch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214831104 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx P
ath:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1124 09:25:44.075607 1707070 docker.go:319] overlay module found
	I1124 09:25:44.080493 1707070 out.go:179] * Using the docker driver based on existing profile
	I1124 09:25:44.083298 1707070 start.go:309] selected driver: docker
	I1124 09:25:44.083323 1707070 start.go:927] validating driver "docker" against &{Name:functional-291288 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-291288 Namespace:default APIServerHAVIP: APIS
erverName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false Disa
bleCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1124 09:25:44.083409 1707070 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1124 09:25:44.083513 1707070 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1124 09:25:44.137525 1707070 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:55 SystemTime:2025-11-24 09:25:44.127840235 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214831104 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1124 09:25:44.137959 1707070 start_flags.go:992] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I1124 09:25:44.137984 1707070 cni.go:84] Creating CNI manager for ""
	I1124 09:25:44.138040 1707070 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1124 09:25:44.138097 1707070 start.go:353] cluster config:
	{Name:functional-291288 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-291288 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1124 09:25:44.143064 1707070 out.go:179] * Starting "functional-291288" primary control-plane node in "functional-291288" cluster
	I1124 09:25:44.145761 1707070 cache.go:134] Beginning downloading kic base image for docker with containerd
	I1124 09:25:44.148578 1707070 out.go:179] * Pulling base image v0.0.48-1763789673-21948 ...
	I1124 09:25:44.151418 1707070 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
	I1124 09:25:44.151496 1707070 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f in local docker daemon
	I1124 09:25:44.171581 1707070 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f in local docker daemon, skipping pull
	I1124 09:25:44.171593 1707070 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f exists in daemon, skipping load
	W1124 09:25:44.210575 1707070 preload.go:144] https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.35.0-beta.0/preloaded-images-k8s-v18-v1.35.0-beta.0-containerd-overlay2-arm64.tar.lz4 status code: 404
	W1124 09:25:44.425167 1707070 preload.go:144] https://github.com/kubernetes-sigs/minikube-preloads/releases/download/v18/preloaded-images-k8s-v18-v1.35.0-beta.0-containerd-overlay2-arm64.tar.lz4 status code: 404
	I1124 09:25:44.425335 1707070 profile.go:143] Saving config to /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/config.json ...
	I1124 09:25:44.425459 1707070 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
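	(The kubeadm fetch above pins the download to a published `.sha256` digest rather than caching the binary. A minimal local sketch of that verification step, using stand-in files in a temp directory instead of the real dl.k8s.io URLs:)

```shell
# Sketch of checksum-pinned download verification, with local stand-ins
# for the binary and its published .sha256 file (no network needed).
WORK=$(mktemp -d)
printf 'fake-binary' > "$WORK/kubeadm"                      # stand-in download
sha256sum "$WORK/kubeadm" | awk '{print $1}' > "$WORK/kubeadm.sha256"

# Compare the locally computed digest against the published one.
COMPUTED=$(sha256sum "$WORK/kubeadm" | awk '{print $1}')
PUBLISHED=$(cat "$WORK/kubeadm.sha256")
[ "$COMPUTED" = "$PUBLISHED" ] && VERIFY=ok || VERIFY=fail
echo "$VERIFY"
rm -rf "$WORK"
```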
	I1124 09:25:44.425602 1707070 cache.go:243] Successfully downloaded all kic artifacts
	I1124 09:25:44.425631 1707070 start.go:360] acquireMachinesLock for functional-291288: {Name:mk85384dc057570e1f34db593d357cea738652c4 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:25:44.425681 1707070 start.go:364] duration metric: took 28.381µs to acquireMachinesLock for "functional-291288"
	I1124 09:25:44.425694 1707070 start.go:96] Skipping create...Using existing machine configuration
	I1124 09:25:44.425698 1707070 fix.go:54] fixHost starting: 
	I1124 09:25:44.425962 1707070 cli_runner.go:164] Run: docker container inspect functional-291288 --format={{.State.Status}}
	I1124 09:25:44.443478 1707070 fix.go:112] recreateIfNeeded on functional-291288: state=Running err=<nil>
	W1124 09:25:44.443512 1707070 fix.go:138] unexpected machine state, will restart: <nil>
	I1124 09:25:44.447296 1707070 out.go:252] * Updating the running docker "functional-291288" container ...
	I1124 09:25:44.447326 1707070 machine.go:94] provisionDockerMachine start ...
	I1124 09:25:44.447405 1707070 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:25:44.465953 1707070 main.go:143] libmachine: Using SSH client type: native
	I1124 09:25:44.466284 1707070 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 34684 <nil> <nil>}
	I1124 09:25:44.466291 1707070 main.go:143] libmachine: About to run SSH command:
	hostname
	I1124 09:25:44.603673 1707070 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:25:44.618572 1707070 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-291288
	
	I1124 09:25:44.618586 1707070 ubuntu.go:182] provisioning hostname "functional-291288"
	I1124 09:25:44.618668 1707070 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:25:44.659382 1707070 main.go:143] libmachine: Using SSH client type: native
	I1124 09:25:44.659732 1707070 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 34684 <nil> <nil>}
	I1124 09:25:44.659741 1707070 main.go:143] libmachine: About to run SSH command:
	sudo hostname functional-291288 && echo "functional-291288" | sudo tee /etc/hostname
	I1124 09:25:44.806505 1707070 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:25:44.844189 1707070 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-291288
	
	I1124 09:25:44.844281 1707070 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:25:44.868659 1707070 main.go:143] libmachine: Using SSH client type: native
	I1124 09:25:44.869019 1707070 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 34684 <nil> <nil>}
	I1124 09:25:44.869041 1707070 main.go:143] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sfunctional-291288' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 functional-291288/g' /etc/hosts;
				else 
					echo '127.0.1.1 functional-291288' | sudo tee -a /etc/hosts; 
				fi
			fi
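	(The /etc/hosts snippet above is idempotent: it rewrites an existing 127.0.1.1 line if one is present and appends one otherwise. A sketch of the same logic against a temp copy of the file, so it runs without sudo; `old-name` is a hypothetical stale hostname:)

```shell
# Idempotent hostname entry update, as in the provisioning step above,
# applied to a temp file instead of the real /etc/hosts.
HOSTS=$(mktemp)
HOSTNAME_NEW=functional-291288
printf '127.0.0.1 localhost\n127.0.1.1 old-name\n' > "$HOSTS"
if ! grep -q "\s$HOSTNAME_NEW" "$HOSTS"; then
  if grep -q '^127\.0\.1\.1\s' "$HOSTS"; then
    # An entry exists: rewrite it in place.
    sed -i "s/^127\.0\.1\.1\s.*/127.0.1.1 $HOSTNAME_NEW/" "$HOSTS"
  else
    # No entry yet: append one.
    echo "127.0.1.1 $HOSTNAME_NEW" >> "$HOSTS"
  fi
fi
RESULT=$(grep '^127\.0\.1\.1' "$HOSTS")
echo "$RESULT"
rm -f "$HOSTS"
```

	Running it twice leaves the file unchanged after the first pass, which is why minikube can safely re-run it on an already-provisioned machine.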
	I1124 09:25:44.979106 1707070 cache.go:107] acquiring lock: {Name:mk22a10f0ce1f3295b61e7e76c455d0494a3e278 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:25:44.979193 1707070 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 exists
	I1124 09:25:44.979201 1707070 cache.go:96] cache image "gcr.io/k8s-minikube/storage-provisioner:v5" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5" took 127.862µs
	I1124 09:25:44.979207 1707070 cache.go:80] save to tar file gcr.io/k8s-minikube/storage-provisioner:v5 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 succeeded
	I1124 09:25:44.979198 1707070 cache.go:107] acquiring lock: {Name:mk80fdbe7cdb5bc17c2a82b4ecfd00214559a435 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:25:44.979218 1707070 cache.go:107] acquiring lock: {Name:mk85f1502dbb97830776608fb729eb3605e112e6 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:25:44.979237 1707070 cache.go:107] acquiring lock: {Name:mk46ce3b59d7e062b3dbc8a90fe5b4231f256471 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:25:44.979267 1707070 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 exists
	I1124 09:25:44.979266 1707070 cache.go:107] acquiring lock: {Name:mk1cf42e67442503a46c578224bd3cb68bf682d4 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:25:44.979273 1707070 cache.go:96] cache image "registry.k8s.io/pause:3.10.1" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1" took 55.992µs
	I1124 09:25:44.979277 1707070 cache.go:80] save to tar file registry.k8s.io/pause:3.10.1 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 succeeded
	I1124 09:25:44.979285 1707070 cache.go:107] acquiring lock: {Name:mk726502cb84c177b2e14fee88512325761511c5 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:25:44.979301 1707070 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 exists
	I1124 09:25:44.979310 1707070 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 exists
	I1124 09:25:44.979308 1707070 cache.go:96] cache image "registry.k8s.io/kube-apiserver:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0" took 43.274µs
	I1124 09:25:44.979314 1707070 cache.go:96] cache image "registry.k8s.io/coredns/coredns:v1.13.1" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1" took 29.982µs
	I1124 09:25:44.979319 1707070 cache.go:80] save to tar file registry.k8s.io/coredns/coredns:v1.13.1 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 succeeded
	I1124 09:25:44.979319 1707070 cache.go:80] save to tar file registry.k8s.io/kube-apiserver:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 succeeded
	I1124 09:25:44.979326 1707070 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.5.24-0 exists
	I1124 09:25:44.979330 1707070 cache.go:96] cache image "registry.k8s.io/etcd:3.5.24-0" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.5.24-0" took 94.392µs
	I1124 09:25:44.979336 1707070 cache.go:80] save to tar file registry.k8s.io/etcd:3.5.24-0 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.5.24-0 succeeded
	I1124 09:25:44.979330 1707070 cache.go:107] acquiring lock: {Name:mkfdc49c8e68aee34cee0c9d441ae8a4dca675c6 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:25:44.979345 1707070 cache.go:107] acquiring lock: {Name:mkdbf38e05e2c47c1a7a906a2236e9e7020a94c5 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:25:44.979364 1707070 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 exists
	I1124 09:25:44.979370 1707070 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 exists
	I1124 09:25:44.979368 1707070 cache.go:96] cache image "registry.k8s.io/kube-controller-manager:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0" took 40.427µs
	I1124 09:25:44.979373 1707070 cache.go:96] cache image "registry.k8s.io/kube-scheduler:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0" took 29.49µs
	I1124 09:25:44.979375 1707070 cache.go:80] save to tar file registry.k8s.io/kube-controller-manager:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 succeeded
	I1124 09:25:44.979378 1707070 cache.go:80] save to tar file registry.k8s.io/kube-scheduler:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 succeeded
	I1124 09:25:44.979407 1707070 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 exists
	I1124 09:25:44.979413 1707070 cache.go:96] cache image "registry.k8s.io/kube-proxy:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0" took 225.709µs
	I1124 09:25:44.979418 1707070 cache.go:80] save to tar file registry.k8s.io/kube-proxy:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 succeeded
	I1124 09:25:44.979424 1707070 cache.go:87] Successfully saved all images to host disk.
	I1124 09:25:45.028668 1707070 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	I1124 09:25:45.028686 1707070 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/21978-1652607/.minikube CaCertPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/21978-1652607/.minikube}
	I1124 09:25:45.028706 1707070 ubuntu.go:190] setting up certificates
	I1124 09:25:45.028727 1707070 provision.go:84] configureAuth start
	I1124 09:25:45.028800 1707070 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-291288
	I1124 09:25:45.083635 1707070 provision.go:143] copyHostCerts
	I1124 09:25:45.083709 1707070 exec_runner.go:144] found /home/jenkins/minikube-integration/21978-1652607/.minikube/key.pem, removing ...
	I1124 09:25:45.083718 1707070 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21978-1652607/.minikube/key.pem
	I1124 09:25:45.083806 1707070 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/21978-1652607/.minikube/key.pem (1679 bytes)
	I1124 09:25:45.083920 1707070 exec_runner.go:144] found /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.pem, removing ...
	I1124 09:25:45.083924 1707070 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.pem
	I1124 09:25:45.083951 1707070 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.pem (1078 bytes)
	I1124 09:25:45.084006 1707070 exec_runner.go:144] found /home/jenkins/minikube-integration/21978-1652607/.minikube/cert.pem, removing ...
	I1124 09:25:45.084009 1707070 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21978-1652607/.minikube/cert.pem
	I1124 09:25:45.084038 1707070 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/21978-1652607/.minikube/cert.pem (1123 bytes)
	I1124 09:25:45.084083 1707070 provision.go:117] generating server cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca-key.pem org=jenkins.functional-291288 san=[127.0.0.1 192.168.49.2 functional-291288 localhost minikube]
	I1124 09:25:45.498574 1707070 provision.go:177] copyRemoteCerts
	I1124 09:25:45.498637 1707070 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1124 09:25:45.498677 1707070 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:25:45.520187 1707070 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34684 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-291288/id_rsa Username:docker}
	I1124 09:25:45.626724 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I1124 09:25:45.644660 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server.pem --> /etc/docker/server.pem (1220 bytes)
	I1124 09:25:45.663269 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I1124 09:25:45.681392 1707070 provision.go:87] duration metric: took 652.643227ms to configureAuth
	I1124 09:25:45.681410 1707070 ubuntu.go:206] setting minikube options for container-runtime
	I1124 09:25:45.681611 1707070 config.go:182] Loaded profile config "functional-291288": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1124 09:25:45.681617 1707070 machine.go:97] duration metric: took 1.234286229s to provisionDockerMachine
	I1124 09:25:45.681624 1707070 start.go:293] postStartSetup for "functional-291288" (driver="docker")
	I1124 09:25:45.681634 1707070 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1124 09:25:45.681687 1707070 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1124 09:25:45.681727 1707070 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:25:45.698790 1707070 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34684 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-291288/id_rsa Username:docker}
	I1124 09:25:45.802503 1707070 ssh_runner.go:195] Run: cat /etc/os-release
	I1124 09:25:45.805922 1707070 main.go:143] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1124 09:25:45.805944 1707070 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1124 09:25:45.805954 1707070 filesync.go:126] Scanning /home/jenkins/minikube-integration/21978-1652607/.minikube/addons for local assets ...
	I1124 09:25:45.806011 1707070 filesync.go:126] Scanning /home/jenkins/minikube-integration/21978-1652607/.minikube/files for local assets ...
	I1124 09:25:45.806087 1707070 filesync.go:149] local asset: /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/ssl/certs/16544672.pem -> 16544672.pem in /etc/ssl/certs
	I1124 09:25:45.806167 1707070 filesync.go:149] local asset: /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/test/nested/copy/1654467/hosts -> hosts in /etc/test/nested/copy/1654467
	I1124 09:25:45.806257 1707070 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs /etc/test/nested/copy/1654467
	I1124 09:25:45.814093 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/ssl/certs/16544672.pem --> /etc/ssl/certs/16544672.pem (1708 bytes)
	I1124 09:25:45.832308 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/test/nested/copy/1654467/hosts --> /etc/test/nested/copy/1654467/hosts (40 bytes)
	I1124 09:25:45.850625 1707070 start.go:296] duration metric: took 168.9873ms for postStartSetup
	I1124 09:25:45.850696 1707070 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1124 09:25:45.850734 1707070 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:25:45.868479 1707070 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34684 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-291288/id_rsa Username:docker}
	I1124 09:25:45.971382 1707070 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1124 09:25:45.976655 1707070 fix.go:56] duration metric: took 1.550948262s for fixHost
	I1124 09:25:45.976671 1707070 start.go:83] releasing machines lock for "functional-291288", held for 1.550982815s
	I1124 09:25:45.976739 1707070 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-291288
	I1124 09:25:45.997505 1707070 ssh_runner.go:195] Run: cat /version.json
	I1124 09:25:45.997527 1707070 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1124 09:25:45.997550 1707070 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:25:45.997588 1707070 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:25:46.017321 1707070 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34684 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-291288/id_rsa Username:docker}
	I1124 09:25:46.018732 1707070 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34684 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-291288/id_rsa Username:docker}
	I1124 09:25:46.118131 1707070 ssh_runner.go:195] Run: systemctl --version
	I1124 09:25:46.213854 1707070 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W1124 09:25:46.218087 1707070 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1124 09:25:46.218149 1707070 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1124 09:25:46.225944 1707070 cni.go:259] no active bridge cni configs found in "/etc/cni/net.d" - nothing to disable
	I1124 09:25:46.225958 1707070 start.go:496] detecting cgroup driver to use...
	I1124 09:25:46.225989 1707070 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1124 09:25:46.226035 1707070 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I1124 09:25:46.241323 1707070 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I1124 09:25:46.254720 1707070 docker.go:218] disabling cri-docker service (if available) ...
	I1124 09:25:46.254789 1707070 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1124 09:25:46.270340 1707070 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1124 09:25:46.283549 1707070 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1124 09:25:46.399926 1707070 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1124 09:25:46.515234 1707070 docker.go:234] disabling docker service ...
	I1124 09:25:46.515290 1707070 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1124 09:25:46.529899 1707070 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1124 09:25:46.543047 1707070 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1124 09:25:46.658532 1707070 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1124 09:25:46.775880 1707070 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1124 09:25:46.790551 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I1124 09:25:46.806411 1707070 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:25:46.967053 1707070 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10.1"|' /etc/containerd/config.toml"
	I1124 09:25:46.977583 1707070 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I1124 09:25:46.986552 1707070 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I1124 09:25:46.986618 1707070 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I1124 09:25:46.995635 1707070 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1124 09:25:47.005680 1707070 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I1124 09:25:47.015425 1707070 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1124 09:25:47.024808 1707070 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1124 09:25:47.033022 1707070 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I1124 09:25:47.041980 1707070 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I1124 09:25:47.051362 1707070 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
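	(The run of `sed` commands above rewrites /etc/containerd/config.toml in place; the key one for the detected "cgroupfs" driver flips `SystemdCgroup`. A sketch of that edit against a minimal sample config fragment in a temp file, no sudo required:)

```shell
# Flip SystemdCgroup in a sample containerd config fragment, preserving
# indentation via the captured group, as in the sed command above.
CFG=$(mktemp)
cat > "$CFG" <<'EOF'
[plugins."io.containerd.grpc.v1.cri".containerd.runtimes.runc.options]
  SystemdCgroup = true
EOF
sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' "$CFG"
RESULT2=$(grep 'SystemdCgroup' "$CFG")
echo "$RESULT2"
rm -f "$CFG"
```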
	I1124 09:25:47.060469 1707070 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1124 09:25:47.068004 1707070 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1124 09:25:47.075326 1707070 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1124 09:25:47.191217 1707070 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I1124 09:25:47.313892 1707070 start.go:543] Will wait 60s for socket path /run/containerd/containerd.sock
	I1124 09:25:47.313955 1707070 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I1124 09:25:47.318001 1707070 start.go:564] Will wait 60s for crictl version
	I1124 09:25:47.318060 1707070 ssh_runner.go:195] Run: which crictl
	I1124 09:25:47.321766 1707070 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1124 09:25:47.347974 1707070 start.go:580] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v2.1.5
	RuntimeApiVersion:  v1
	I1124 09:25:47.348042 1707070 ssh_runner.go:195] Run: containerd --version
	I1124 09:25:47.369074 1707070 ssh_runner.go:195] Run: containerd --version
	I1124 09:25:47.394675 1707070 out.go:179] * Preparing Kubernetes v1.35.0-beta.0 on containerd 2.1.5 ...
	I1124 09:25:47.397593 1707070 cli_runner.go:164] Run: docker network inspect functional-291288 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1124 09:25:47.412872 1707070 ssh_runner.go:195] Run: grep 192.168.49.1	host.minikube.internal$ /etc/hosts
	I1124 09:25:47.419437 1707070 out.go:179]   - apiserver.enable-admission-plugins=NamespaceAutoProvision
	I1124 09:25:47.422135 1707070 kubeadm.go:884] updating cluster {Name:functional-291288 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-291288 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1124 09:25:47.422352 1707070 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:25:47.578507 1707070 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:25:47.745390 1707070 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:25:47.894887 1707070 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
	I1124 09:25:47.894982 1707070 ssh_runner.go:195] Run: sudo crictl images --output json
	I1124 09:25:47.919585 1707070 containerd.go:627] all images are preloaded for containerd runtime.
	I1124 09:25:47.919604 1707070 cache_images.go:86] Images are preloaded, skipping loading
	I1124 09:25:47.919612 1707070 kubeadm.go:935] updating node { 192.168.49.2 8441 v1.35.0-beta.0 containerd true true} ...
	I1124 09:25:47.919707 1707070 kubeadm.go:947] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.35.0-beta.0/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=functional-291288 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.49.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-291288 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I1124 09:25:47.919778 1707070 ssh_runner.go:195] Run: sudo crictl info
	I1124 09:25:47.948265 1707070 extraconfig.go:125] Overwriting default enable-admission-plugins=NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota with user provided enable-admission-plugins=NamespaceAutoProvision for component apiserver
	I1124 09:25:47.948285 1707070 cni.go:84] Creating CNI manager for ""
	I1124 09:25:47.948293 1707070 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1124 09:25:47.948308 1707070 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1124 09:25:47.948331 1707070 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.49.2 APIServerPort:8441 KubernetesVersion:v1.35.0-beta.0 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:functional-291288 NodeName:functional-291288 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceAutoProvision] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.49.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.49.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1124 09:25:47.948441 1707070 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.49.2
	  bindPort: 8441
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "functional-291288"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.49.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceAutoProvision"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8441
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.35.0-beta.0
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
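The kubeadm config printed above is a single multi-document YAML stream (four documents separated by `---`). A quick sanity check of which kinds it contains can be done with plain text tools; this is a sketch against a trimmed stand-in file, not part of minikube:

```shell
# Sketch: list the `kind:` of each document in a multi-doc kubeadm YAML
# stream, using grep only (no YAML tooling). The file here is a trimmed
# stand-in mirroring the four documents generated above.
cat > /tmp/kubeadm-demo.yaml <<'EOF'
apiVersion: kubeadm.k8s.io/v1beta4
kind: InitConfiguration
---
apiVersion: kubeadm.k8s.io/v1beta4
kind: ClusterConfiguration
---
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
---
apiVersion: kubeproxy.config.k8s.io/v1alpha1
kind: KubeProxyConfiguration
EOF
kinds=$(grep '^kind:' /tmp/kubeadm-demo.yaml | awk '{print $2}')
echo "$kinds"
```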
	I1124 09:25:47.948507 1707070 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.35.0-beta.0
	I1124 09:25:47.956183 1707070 binaries.go:51] Found k8s binaries, skipping transfer
	I1124 09:25:47.956246 1707070 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1124 09:25:47.963641 1707070 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (328 bytes)
	I1124 09:25:47.976586 1707070 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (359 bytes)
	I1124 09:25:47.989056 1707070 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2087 bytes)
	I1124 09:25:48.003961 1707070 ssh_runner.go:195] Run: grep 192.168.49.2	control-plane.minikube.internal$ /etc/hosts
	I1124 09:25:48.011533 1707070 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1124 09:25:48.134407 1707070 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1124 09:25:48.383061 1707070 certs.go:69] Setting up /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288 for IP: 192.168.49.2
	I1124 09:25:48.383072 1707070 certs.go:195] generating shared ca certs ...
	I1124 09:25:48.383086 1707070 certs.go:227] acquiring lock for ca certs: {Name:mkbe540a30c4376a351176f7fe6fec044d058b09 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1124 09:25:48.383238 1707070 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.key
	I1124 09:25:48.383279 1707070 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/proxy-client-ca.key
	I1124 09:25:48.383286 1707070 certs.go:257] generating profile certs ...
	I1124 09:25:48.383366 1707070 certs.go:360] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/client.key
	I1124 09:25:48.383420 1707070 certs.go:360] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/apiserver.key.5acb2515
	I1124 09:25:48.383456 1707070 certs.go:360] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/proxy-client.key
	I1124 09:25:48.383562 1707070 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/1654467.pem (1338 bytes)
	W1124 09:25:48.383598 1707070 certs.go:480] ignoring /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/1654467_empty.pem, impossibly tiny 0 bytes
	I1124 09:25:48.383605 1707070 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca-key.pem (1671 bytes)
	I1124 09:25:48.383632 1707070 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem (1078 bytes)
	I1124 09:25:48.383655 1707070 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/cert.pem (1123 bytes)
	I1124 09:25:48.383684 1707070 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/key.pem (1679 bytes)
	I1124 09:25:48.383730 1707070 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/ssl/certs/16544672.pem (1708 bytes)
	I1124 09:25:48.384294 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1124 09:25:48.403533 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I1124 09:25:48.421212 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1124 09:25:48.441887 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I1124 09:25:48.462311 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I1124 09:25:48.480889 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I1124 09:25:48.499086 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1124 09:25:48.517112 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I1124 09:25:48.535554 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/ssl/certs/16544672.pem --> /usr/share/ca-certificates/16544672.pem (1708 bytes)
	I1124 09:25:48.553310 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1124 09:25:48.571447 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/1654467.pem --> /usr/share/ca-certificates/1654467.pem (1338 bytes)
	I1124 09:25:48.589094 1707070 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I1124 09:25:48.602393 1707070 ssh_runner.go:195] Run: openssl version
	I1124 09:25:48.608953 1707070 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/16544672.pem && ln -fs /usr/share/ca-certificates/16544672.pem /etc/ssl/certs/16544672.pem"
	I1124 09:25:48.617886 1707070 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/16544672.pem
	I1124 09:25:48.621697 1707070 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Nov 24 09:10 /usr/share/ca-certificates/16544672.pem
	I1124 09:25:48.621756 1707070 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/16544672.pem
	I1124 09:25:48.663214 1707070 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/16544672.pem /etc/ssl/certs/3ec20f2e.0"
	I1124 09:25:48.671328 1707070 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I1124 09:25:48.679977 1707070 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1124 09:25:48.683961 1707070 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Nov 24 08:44 /usr/share/ca-certificates/minikubeCA.pem
	I1124 09:25:48.684024 1707070 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1124 09:25:48.725273 1707070 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I1124 09:25:48.733278 1707070 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1654467.pem && ln -fs /usr/share/ca-certificates/1654467.pem /etc/ssl/certs/1654467.pem"
	I1124 09:25:48.741887 1707070 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1654467.pem
	I1124 09:25:48.745440 1707070 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Nov 24 09:10 /usr/share/ca-certificates/1654467.pem
	I1124 09:25:48.745500 1707070 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1654467.pem
	I1124 09:25:48.791338 1707070 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1654467.pem /etc/ssl/certs/51391683.0"
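The `openssl x509 -hash` / `ln -fs` pairs above implement OpenSSL's lookup-by-subject-hash convention: each CA in /etc/ssl/certs needs a `<subject_hash>.0` symlink pointing at its PEM file so OpenSSL can find it. A minimal sketch in a throwaway directory (the CN and paths are hypothetical):

```shell
# Sketch of the CA trust wiring performed above, in a temp directory:
# 1. generate a throwaway self-signed CA cert
# 2. compute its OpenSSL subject-name hash
# 3. create the <hash>.0 symlink that OpenSSL's by-hash lookup expects
dir=$(mktemp -d)
openssl req -x509 -newkey rsa:2048 -nodes -subj "/CN=demoCA" \
  -keyout "$dir/ca.key" -out "$dir/ca.pem" -days 1 >/dev/null 2>&1
hash=$(openssl x509 -hash -noout -in "$dir/ca.pem")
ln -fs "$dir/ca.pem" "$dir/$hash.0"
ls -l "$dir/$hash.0"
```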
	I1124 09:25:48.799503 1707070 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1124 09:25:48.803145 1707070 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I1124 09:25:48.844016 1707070 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I1124 09:25:48.884962 1707070 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I1124 09:25:48.926044 1707070 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I1124 09:25:48.967289 1707070 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I1124 09:25:49.008697 1707070 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
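The `-checkend 86400` runs above ask whether each certificate will still be valid 86400 seconds (24 hours) from now: exit 0 means yes, non-zero means it expires within the window. A self-contained sketch with a throwaway two-day certificate (subject and paths are hypothetical):

```shell
# Sketch: demonstrate `openssl x509 -checkend` semantics. A cert valid for
# 2 more days passes a 24 h check; the command's exit status is the answer.
certdir=$(mktemp -d)
openssl req -x509 -newkey rsa:2048 -nodes -subj "/CN=demo" \
  -keyout "$certdir/k.pem" -out "$certdir/c.pem" -days 2 >/dev/null 2>&1
if openssl x509 -noout -in "$certdir/c.pem" -checkend 86400; then
  echo "still valid for at least another day"
fi
```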
	I1124 09:25:49.049934 1707070 kubeadm.go:401] StartCluster: {Name:functional-291288 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-291288 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1124 09:25:49.050012 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I1124 09:25:49.050074 1707070 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1124 09:25:49.080420 1707070 cri.go:89] found id: ""
	I1124 09:25:49.080484 1707070 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1124 09:25:49.088364 1707070 kubeadm.go:417] found existing configuration files, will attempt cluster restart
	I1124 09:25:49.088374 1707070 kubeadm.go:598] restartPrimaryControlPlane start ...
	I1124 09:25:49.088425 1707070 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I1124 09:25:49.095680 1707070 kubeadm.go:131] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I1124 09:25:49.096194 1707070 kubeconfig.go:125] found "functional-291288" server: "https://192.168.49.2:8441"
	I1124 09:25:49.097500 1707070 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I1124 09:25:49.105267 1707070 kubeadm.go:645] detected kubeadm config drift (will reconfigure cluster from new /var/tmp/minikube/kubeadm.yaml):
	-- stdout --
	--- /var/tmp/minikube/kubeadm.yaml	2025-11-24 09:11:10.138797725 +0000
	+++ /var/tmp/minikube/kubeadm.yaml.new	2025-11-24 09:25:47.995648074 +0000
	@@ -24,7 +24,7 @@
	   certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	   extraArgs:
	     - name: "enable-admission-plugins"
	-      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	+      value: "NamespaceAutoProvision"
	 controllerManager:
	   extraArgs:
	     - name: "allocate-node-cidrs"
	
	-- /stdout --
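The drift check above is a plain `diff -u` of the deployed kubeadm.yaml against the freshly generated one; a non-zero exit status (differences found) is what triggers the reconfiguration. A sketch with hypothetical one-line stand-ins for the two files:

```shell
# Sketch: reproduce the config-drift decision. `diff -u` exits non-zero
# when the files differ, which here stands in for "reconfigure cluster".
old=$(mktemp); new=$(mktemp)
echo 'value: "NamespaceLifecycle,...,ResourceQuota"' > "$old"
echo 'value: "NamespaceAutoProvision"' > "$new"
if ! diff -u "$old" "$new"; then
  echo "config drift detected: reconfigure"
fi
```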
	I1124 09:25:49.105285 1707070 kubeadm.go:1161] stopping kube-system containers ...
	I1124 09:25:49.105296 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name: Namespaces:[kube-system]}
	I1124 09:25:49.105351 1707070 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1124 09:25:49.142256 1707070 cri.go:89] found id: ""
	I1124 09:25:49.142317 1707070 ssh_runner.go:195] Run: sudo systemctl stop kubelet
	I1124 09:25:49.162851 1707070 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1124 09:25:49.170804 1707070 kubeadm.go:158] found existing configuration files:
	-rw------- 1 root root 5635 Nov 24 09:15 /etc/kubernetes/admin.conf
	-rw------- 1 root root 5636 Nov 24 09:15 /etc/kubernetes/controller-manager.conf
	-rw------- 1 root root 5672 Nov 24 09:15 /etc/kubernetes/kubelet.conf
	-rw------- 1 root root 5584 Nov 24 09:15 /etc/kubernetes/scheduler.conf
	
	I1124 09:25:49.170876 1707070 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1124 09:25:49.178603 1707070 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1124 09:25:49.185907 1707070 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1124 09:25:49.185964 1707070 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1124 09:25:49.193453 1707070 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1124 09:25:49.200815 1707070 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1124 09:25:49.200869 1707070 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1124 09:25:49.208328 1707070 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1124 09:25:49.215968 1707070 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1124 09:25:49.216025 1707070 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1124 09:25:49.223400 1707070 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1124 09:25:49.230953 1707070 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase certs all --config /var/tmp/minikube/kubeadm.yaml"
	I1124 09:25:49.277779 1707070 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase kubeconfig all --config /var/tmp/minikube/kubeadm.yaml"
	I1124 09:25:50.308934 1707070 ssh_runner.go:235] Completed: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase kubeconfig all --config /var/tmp/minikube/kubeadm.yaml": (1.031131442s)
	I1124 09:25:50.308993 1707070 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase kubelet-start --config /var/tmp/minikube/kubeadm.yaml"
	I1124 09:25:50.511648 1707070 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase control-plane all --config /var/tmp/minikube/kubeadm.yaml"
	I1124 09:25:50.576653 1707070 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase etcd local --config /var/tmp/minikube/kubeadm.yaml"
	I1124 09:25:50.625775 1707070 api_server.go:52] waiting for apiserver process to appear ...
	I1124 09:25:50.625855 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:51.126713 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:51.625939 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:52.126677 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:52.626053 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:53.126113 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:53.626972 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:54.126493 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:54.626036 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:55.126171 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:55.626853 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:56.126041 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:56.626177 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:57.126019 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:57.626847 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:58.126017 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:58.626716 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:59.125997 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:59.626367 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:00.125951 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:00.626013 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:01.126844 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:01.626038 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:02.126420 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:02.626727 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:03.126582 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:03.626068 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:04.126304 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:04.626830 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:05.126754 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:05.625961 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:06.126197 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:06.626039 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:07.126915 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:07.626052 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:08.126281 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:08.626116 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:09.126574 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:09.626068 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:10.125978 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:10.626328 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:11.126416 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:11.626073 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:12.126027 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:12.626174 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:13.126044 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:13.626781 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:14.126849 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:14.626203 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:15.125957 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:15.626068 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:16.126934 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:16.626382 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:17.126245 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:17.626034 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:18.126745 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:18.626942 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:19.126393 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:19.626607 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:20.126050 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:20.626732 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:21.126049 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:21.626115 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:22.125988 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:22.626261 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:23.126293 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:23.626107 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:24.126971 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:24.626009 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:25.126859 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:25.626876 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:26.126041 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:26.625983 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:27.126168 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:27.626079 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:28.126047 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:28.626761 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:29.126598 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:29.626290 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:30.125941 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:30.626102 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:31.126717 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:31.626588 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:32.126223 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:32.626875 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:33.126051 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:33.625963 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:34.126808 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:34.626621 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:35.126147 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:35.626018 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:36.126039 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:36.625970 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:37.126579 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:37.626198 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:38.126718 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:38.626386 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:39.126159 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:39.626590 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:40.126050 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:40.626422 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:41.126600 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:41.626097 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:42.127732 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:42.626108 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:43.126855 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:43.626202 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:44.126380 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:44.626423 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:45.127019 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:45.626257 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:46.125911 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:46.626125 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:47.126026 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:47.626915 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:48.126322 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:48.626706 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:49.126864 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:49.627009 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:50.126375 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:50.626418 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:26:50.626521 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:26:50.654529 1707070 cri.go:89] found id: ""
	I1124 09:26:50.654543 1707070 logs.go:282] 0 containers: []
	W1124 09:26:50.654550 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:26:50.654555 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:26:50.654624 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:26:50.683038 1707070 cri.go:89] found id: ""
	I1124 09:26:50.683052 1707070 logs.go:282] 0 containers: []
	W1124 09:26:50.683059 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:26:50.683064 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:26:50.683121 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:26:50.711396 1707070 cri.go:89] found id: ""
	I1124 09:26:50.711410 1707070 logs.go:282] 0 containers: []
	W1124 09:26:50.711422 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:26:50.711433 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:26:50.711498 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:26:50.735435 1707070 cri.go:89] found id: ""
	I1124 09:26:50.735449 1707070 logs.go:282] 0 containers: []
	W1124 09:26:50.735457 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:26:50.735463 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:26:50.735520 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:26:50.760437 1707070 cri.go:89] found id: ""
	I1124 09:26:50.760451 1707070 logs.go:282] 0 containers: []
	W1124 09:26:50.760458 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:26:50.760464 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:26:50.760520 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:26:50.785555 1707070 cri.go:89] found id: ""
	I1124 09:26:50.785576 1707070 logs.go:282] 0 containers: []
	W1124 09:26:50.785584 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:26:50.785590 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:26:50.785662 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:26:50.810261 1707070 cri.go:89] found id: ""
	I1124 09:26:50.810278 1707070 logs.go:282] 0 containers: []
	W1124 09:26:50.810286 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:26:50.810294 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:26:50.810305 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:26:50.879322 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:26:50.870488   11382 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:50.871030   11382 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:50.872890   11382 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:50.873352   11382 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:50.875005   11382 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:26:50.870488   11382 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:50.871030   11382 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:50.872890   11382 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:50.873352   11382 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:50.875005   11382 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:26:50.879334 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:26:50.879345 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:26:50.941117 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:26:50.941140 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:26:50.969259 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:26:50.969275 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:26:51.024741 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:26:51.024763 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:26:53.542977 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:53.553083 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:26:53.553155 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:26:53.577781 1707070 cri.go:89] found id: ""
	I1124 09:26:53.577795 1707070 logs.go:282] 0 containers: []
	W1124 09:26:53.577802 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:26:53.577808 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:26:53.577866 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:26:53.604191 1707070 cri.go:89] found id: ""
	I1124 09:26:53.604205 1707070 logs.go:282] 0 containers: []
	W1124 09:26:53.604212 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:26:53.604217 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:26:53.604277 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:26:53.632984 1707070 cri.go:89] found id: ""
	I1124 09:26:53.632998 1707070 logs.go:282] 0 containers: []
	W1124 09:26:53.633004 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:26:53.633010 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:26:53.633071 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:26:53.663828 1707070 cri.go:89] found id: ""
	I1124 09:26:53.663842 1707070 logs.go:282] 0 containers: []
	W1124 09:26:53.663850 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:26:53.663856 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:26:53.663912 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:26:53.695173 1707070 cri.go:89] found id: ""
	I1124 09:26:53.695187 1707070 logs.go:282] 0 containers: []
	W1124 09:26:53.695195 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:26:53.695200 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:26:53.695259 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:26:53.719882 1707070 cri.go:89] found id: ""
	I1124 09:26:53.719897 1707070 logs.go:282] 0 containers: []
	W1124 09:26:53.719904 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:26:53.719910 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:26:53.719993 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:26:53.753006 1707070 cri.go:89] found id: ""
	I1124 09:26:53.753020 1707070 logs.go:282] 0 containers: []
	W1124 09:26:53.753038 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:26:53.753046 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:26:53.753057 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:26:53.810839 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:26:53.810864 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:26:53.828132 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:26:53.828149 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:26:53.893802 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:26:53.885327   11494 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:53.886130   11494 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:53.888016   11494 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:53.888539   11494 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:53.890056   11494 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:26:53.885327   11494 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:53.886130   11494 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:53.888016   11494 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:53.888539   11494 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:53.890056   11494 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:26:53.893815 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:26:53.893825 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:26:53.955840 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:26:53.955860 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:26:56.485625 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:56.495752 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:26:56.495812 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:26:56.523600 1707070 cri.go:89] found id: ""
	I1124 09:26:56.523614 1707070 logs.go:282] 0 containers: []
	W1124 09:26:56.523622 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:26:56.523627 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:26:56.523730 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:26:56.547432 1707070 cri.go:89] found id: ""
	I1124 09:26:56.547445 1707070 logs.go:282] 0 containers: []
	W1124 09:26:56.547453 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:26:56.547465 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:26:56.547522 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:26:56.571895 1707070 cri.go:89] found id: ""
	I1124 09:26:56.571909 1707070 logs.go:282] 0 containers: []
	W1124 09:26:56.571917 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:26:56.571922 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:26:56.571977 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:26:56.596624 1707070 cri.go:89] found id: ""
	I1124 09:26:56.596637 1707070 logs.go:282] 0 containers: []
	W1124 09:26:56.596644 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:26:56.596650 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:26:56.596705 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:26:56.621497 1707070 cri.go:89] found id: ""
	I1124 09:26:56.621511 1707070 logs.go:282] 0 containers: []
	W1124 09:26:56.621518 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:26:56.621523 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:26:56.621588 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:26:56.656808 1707070 cri.go:89] found id: ""
	I1124 09:26:56.656822 1707070 logs.go:282] 0 containers: []
	W1124 09:26:56.656829 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:26:56.656834 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:26:56.656891 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:26:56.693750 1707070 cri.go:89] found id: ""
	I1124 09:26:56.693763 1707070 logs.go:282] 0 containers: []
	W1124 09:26:56.693770 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:26:56.693778 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:26:56.693799 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:26:56.711624 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:26:56.711642 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:26:56.772006 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:26:56.764543   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:56.764946   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:56.766216   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:56.766780   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:56.768382   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:26:56.764543   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:56.764946   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:56.766216   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:56.766780   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:56.768382   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:26:56.772020 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:26:56.772030 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:26:56.832784 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:26:56.832805 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:26:56.862164 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:26:56.862179 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:26:59.417328 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:59.427445 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:26:59.427506 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:26:59.451539 1707070 cri.go:89] found id: ""
	I1124 09:26:59.451574 1707070 logs.go:282] 0 containers: []
	W1124 09:26:59.451582 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:26:59.451588 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:26:59.451647 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:26:59.476110 1707070 cri.go:89] found id: ""
	I1124 09:26:59.476124 1707070 logs.go:282] 0 containers: []
	W1124 09:26:59.476131 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:26:59.476137 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:26:59.476194 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:26:59.504520 1707070 cri.go:89] found id: ""
	I1124 09:26:59.504533 1707070 logs.go:282] 0 containers: []
	W1124 09:26:59.504540 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:26:59.504546 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:26:59.504607 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:26:59.529647 1707070 cri.go:89] found id: ""
	I1124 09:26:59.529662 1707070 logs.go:282] 0 containers: []
	W1124 09:26:59.529669 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:26:59.529674 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:26:59.529753 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:26:59.558904 1707070 cri.go:89] found id: ""
	I1124 09:26:59.558918 1707070 logs.go:282] 0 containers: []
	W1124 09:26:59.558925 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:26:59.558930 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:26:59.558999 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:26:59.583698 1707070 cri.go:89] found id: ""
	I1124 09:26:59.583712 1707070 logs.go:282] 0 containers: []
	W1124 09:26:59.583733 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:26:59.583738 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:26:59.583800 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:26:59.607605 1707070 cri.go:89] found id: ""
	I1124 09:26:59.607619 1707070 logs.go:282] 0 containers: []
	W1124 09:26:59.607626 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:26:59.607634 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:26:59.607645 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:26:59.624446 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:26:59.624462 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:26:59.711588 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:26:59.701837   11699 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:59.703242   11699 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:59.704228   11699 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:59.706009   11699 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:59.706513   11699 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:26:59.701837   11699 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:59.703242   11699 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:59.704228   11699 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:59.706009   11699 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:59.706513   11699 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:26:59.711600 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:26:59.711610 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:26:59.777617 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:26:59.777638 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:26:59.810868 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:26:59.810888 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:02.368395 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:02.379444 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:02.379503 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:02.403995 1707070 cri.go:89] found id: ""
	I1124 09:27:02.404009 1707070 logs.go:282] 0 containers: []
	W1124 09:27:02.404017 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:02.404022 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:02.404080 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:02.428532 1707070 cri.go:89] found id: ""
	I1124 09:27:02.428546 1707070 logs.go:282] 0 containers: []
	W1124 09:27:02.428553 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:02.428559 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:02.428623 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:02.455148 1707070 cri.go:89] found id: ""
	I1124 09:27:02.455162 1707070 logs.go:282] 0 containers: []
	W1124 09:27:02.455169 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:02.455174 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:02.455233 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:02.479942 1707070 cri.go:89] found id: ""
	I1124 09:27:02.479957 1707070 logs.go:282] 0 containers: []
	W1124 09:27:02.479969 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:02.479975 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:02.480034 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:02.505728 1707070 cri.go:89] found id: ""
	I1124 09:27:02.505744 1707070 logs.go:282] 0 containers: []
	W1124 09:27:02.505751 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:02.505760 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:02.505845 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:02.536863 1707070 cri.go:89] found id: ""
	I1124 09:27:02.536881 1707070 logs.go:282] 0 containers: []
	W1124 09:27:02.536889 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:02.536894 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:02.536960 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:02.566083 1707070 cri.go:89] found id: ""
	I1124 09:27:02.566107 1707070 logs.go:282] 0 containers: []
	W1124 09:27:02.566124 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:02.566132 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:02.566142 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:02.628402 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:02.628423 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:02.669505 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:02.669523 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:02.737879 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:02.737907 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:02.755317 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:02.755334 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:02.820465 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:02.811248   11818 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:02.812608   11818 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:02.813513   11818 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:02.815318   11818 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:02.815727   11818 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:02.811248   11818 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:02.812608   11818 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:02.813513   11818 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:02.815318   11818 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:02.815727   11818 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:05.320749 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:05.331020 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:05.331081 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:05.355889 1707070 cri.go:89] found id: ""
	I1124 09:27:05.355904 1707070 logs.go:282] 0 containers: []
	W1124 09:27:05.355912 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:05.355917 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:05.355980 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:05.381650 1707070 cri.go:89] found id: ""
	I1124 09:27:05.381664 1707070 logs.go:282] 0 containers: []
	W1124 09:27:05.381671 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:05.381676 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:05.381733 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:05.410311 1707070 cri.go:89] found id: ""
	I1124 09:27:05.410325 1707070 logs.go:282] 0 containers: []
	W1124 09:27:05.410332 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:05.410337 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:05.410396 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:05.434601 1707070 cri.go:89] found id: ""
	I1124 09:27:05.434615 1707070 logs.go:282] 0 containers: []
	W1124 09:27:05.434621 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:05.434627 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:05.434684 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:05.459196 1707070 cri.go:89] found id: ""
	I1124 09:27:05.459210 1707070 logs.go:282] 0 containers: []
	W1124 09:27:05.459218 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:05.459223 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:05.459294 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:05.483433 1707070 cri.go:89] found id: ""
	I1124 09:27:05.483448 1707070 logs.go:282] 0 containers: []
	W1124 09:27:05.483455 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:05.483460 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:05.483523 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:05.508072 1707070 cri.go:89] found id: ""
	I1124 09:27:05.508086 1707070 logs.go:282] 0 containers: []
	W1124 09:27:05.508093 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:05.508101 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:05.508111 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:05.563733 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:05.563752 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:05.584705 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:05.584736 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:05.666380 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:05.657873   11904 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:05.658740   11904 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:05.660432   11904 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:05.660828   11904 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:05.662363   11904 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:05.657873   11904 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:05.658740   11904 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:05.660432   11904 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:05.660828   11904 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:05.662363   11904 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:05.666394 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:05.666405 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:05.738526 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:05.738548 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:08.268404 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:08.278347 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:08.278408 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:08.303562 1707070 cri.go:89] found id: ""
	I1124 09:27:08.303577 1707070 logs.go:282] 0 containers: []
	W1124 09:27:08.303585 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:08.303590 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:08.303651 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:08.329886 1707070 cri.go:89] found id: ""
	I1124 09:27:08.329900 1707070 logs.go:282] 0 containers: []
	W1124 09:27:08.329907 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:08.329913 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:08.329971 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:08.355081 1707070 cri.go:89] found id: ""
	I1124 09:27:08.355096 1707070 logs.go:282] 0 containers: []
	W1124 09:27:08.355104 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:08.355110 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:08.355175 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:08.381511 1707070 cri.go:89] found id: ""
	I1124 09:27:08.381534 1707070 logs.go:282] 0 containers: []
	W1124 09:27:08.381543 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:08.381549 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:08.381620 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:08.410606 1707070 cri.go:89] found id: ""
	I1124 09:27:08.410629 1707070 logs.go:282] 0 containers: []
	W1124 09:27:08.410637 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:08.410642 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:08.410700 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:08.434980 1707070 cri.go:89] found id: ""
	I1124 09:27:08.434994 1707070 logs.go:282] 0 containers: []
	W1124 09:27:08.435001 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:08.435007 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:08.435064 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:08.463780 1707070 cri.go:89] found id: ""
	I1124 09:27:08.463793 1707070 logs.go:282] 0 containers: []
	W1124 09:27:08.463800 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:08.463808 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:08.463819 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:08.527201 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:08.518614   12003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:08.519320   12003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:08.521220   12003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:08.521832   12003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:08.523649   12003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:08.518614   12003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:08.519320   12003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:08.521220   12003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:08.521832   12003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:08.523649   12003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:08.527213 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:08.527223 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:08.591559 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:08.591581 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:08.619107 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:08.619125 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:08.678658 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:08.678675 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:11.199028 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:11.209463 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:11.209529 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:11.236040 1707070 cri.go:89] found id: ""
	I1124 09:27:11.236061 1707070 logs.go:282] 0 containers: []
	W1124 09:27:11.236069 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:11.236075 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:11.236145 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:11.263895 1707070 cri.go:89] found id: ""
	I1124 09:27:11.263906 1707070 logs.go:282] 0 containers: []
	W1124 09:27:11.263912 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:11.263917 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:11.263968 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:11.290492 1707070 cri.go:89] found id: ""
	I1124 09:27:11.290507 1707070 logs.go:282] 0 containers: []
	W1124 09:27:11.290514 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:11.290519 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:11.290575 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:11.316763 1707070 cri.go:89] found id: ""
	I1124 09:27:11.316778 1707070 logs.go:282] 0 containers: []
	W1124 09:27:11.316785 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:11.316791 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:11.316899 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:11.340653 1707070 cri.go:89] found id: ""
	I1124 09:27:11.340668 1707070 logs.go:282] 0 containers: []
	W1124 09:27:11.340675 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:11.340680 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:11.340741 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:11.365000 1707070 cri.go:89] found id: ""
	I1124 09:27:11.365013 1707070 logs.go:282] 0 containers: []
	W1124 09:27:11.365020 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:11.365026 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:11.365086 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:11.393012 1707070 cri.go:89] found id: ""
	I1124 09:27:11.393025 1707070 logs.go:282] 0 containers: []
	W1124 09:27:11.393033 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:11.393041 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:11.393053 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:11.409740 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:11.409758 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:11.474068 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:11.465242   12112 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:11.466095   12112 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:11.467959   12112 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:11.468588   12112 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:11.470448   12112 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:11.465242   12112 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:11.466095   12112 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:11.467959   12112 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:11.468588   12112 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:11.470448   12112 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:11.474079 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:11.474089 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:11.535411 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:11.535433 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:11.565626 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:11.565645 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:14.123823 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:14.133770 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:14.133829 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:14.157476 1707070 cri.go:89] found id: ""
	I1124 09:27:14.157490 1707070 logs.go:282] 0 containers: []
	W1124 09:27:14.157497 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:14.157503 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:14.157562 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:14.188747 1707070 cri.go:89] found id: ""
	I1124 09:27:14.188761 1707070 logs.go:282] 0 containers: []
	W1124 09:27:14.188768 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:14.188773 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:14.188830 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:14.216257 1707070 cri.go:89] found id: ""
	I1124 09:27:14.216271 1707070 logs.go:282] 0 containers: []
	W1124 09:27:14.216279 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:14.216284 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:14.216345 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:14.241336 1707070 cri.go:89] found id: ""
	I1124 09:27:14.241349 1707070 logs.go:282] 0 containers: []
	W1124 09:27:14.241357 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:14.241362 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:14.241423 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:14.265223 1707070 cri.go:89] found id: ""
	I1124 09:27:14.265238 1707070 logs.go:282] 0 containers: []
	W1124 09:27:14.265245 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:14.265250 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:14.265312 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:14.292087 1707070 cri.go:89] found id: ""
	I1124 09:27:14.292101 1707070 logs.go:282] 0 containers: []
	W1124 09:27:14.292108 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:14.292114 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:14.292171 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:14.316839 1707070 cri.go:89] found id: ""
	I1124 09:27:14.316854 1707070 logs.go:282] 0 containers: []
	W1124 09:27:14.316861 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:14.316869 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:14.316879 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:14.371692 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:14.371715 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:14.388964 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:14.388980 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:14.455069 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:14.447375   12220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:14.448018   12220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:14.449517   12220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:14.449819   12220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:14.451683   12220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:14.447375   12220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:14.448018   12220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:14.449517   12220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:14.449819   12220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:14.451683   12220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:14.455080 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:14.455090 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:14.518102 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:14.518124 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:17.045537 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:17.055937 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:17.056004 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:17.084357 1707070 cri.go:89] found id: ""
	I1124 09:27:17.084370 1707070 logs.go:282] 0 containers: []
	W1124 09:27:17.084378 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:17.084383 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:17.084439 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:17.112022 1707070 cri.go:89] found id: ""
	I1124 09:27:17.112035 1707070 logs.go:282] 0 containers: []
	W1124 09:27:17.112043 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:17.112048 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:17.112110 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:17.135317 1707070 cri.go:89] found id: ""
	I1124 09:27:17.135331 1707070 logs.go:282] 0 containers: []
	W1124 09:27:17.135338 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:17.135343 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:17.135399 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:17.163850 1707070 cri.go:89] found id: ""
	I1124 09:27:17.163865 1707070 logs.go:282] 0 containers: []
	W1124 09:27:17.163872 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:17.163878 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:17.163933 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:17.188915 1707070 cri.go:89] found id: ""
	I1124 09:27:17.188929 1707070 logs.go:282] 0 containers: []
	W1124 09:27:17.188936 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:17.188941 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:17.188997 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:17.217448 1707070 cri.go:89] found id: ""
	I1124 09:27:17.217461 1707070 logs.go:282] 0 containers: []
	W1124 09:27:17.217475 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:17.217480 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:17.217537 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:17.242521 1707070 cri.go:89] found id: ""
	I1124 09:27:17.242536 1707070 logs.go:282] 0 containers: []
	W1124 09:27:17.242543 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:17.242551 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:17.242561 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:17.297899 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:17.297921 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:17.315278 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:17.315297 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:17.377620 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:17.368489   12324 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:17.368893   12324 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:17.370596   12324 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:17.371050   12324 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:17.372486   12324 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:17.368489   12324 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:17.368893   12324 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:17.370596   12324 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:17.371050   12324 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:17.372486   12324 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:17.377640 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:17.377651 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:17.439884 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:17.439907 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:19.969337 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:19.979536 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:19.979595 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:20.018198 1707070 cri.go:89] found id: ""
	I1124 09:27:20.018220 1707070 logs.go:282] 0 containers: []
	W1124 09:27:20.018229 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:20.018235 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:20.018297 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:20.046055 1707070 cri.go:89] found id: ""
	I1124 09:27:20.046070 1707070 logs.go:282] 0 containers: []
	W1124 09:27:20.046077 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:20.046082 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:20.046158 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:20.078159 1707070 cri.go:89] found id: ""
	I1124 09:27:20.078183 1707070 logs.go:282] 0 containers: []
	W1124 09:27:20.078191 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:20.078197 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:20.078289 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:20.104136 1707070 cri.go:89] found id: ""
	I1124 09:27:20.104151 1707070 logs.go:282] 0 containers: []
	W1124 09:27:20.104158 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:20.104164 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:20.104228 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:20.130266 1707070 cri.go:89] found id: ""
	I1124 09:27:20.130280 1707070 logs.go:282] 0 containers: []
	W1124 09:27:20.130288 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:20.130293 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:20.130352 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:20.156899 1707070 cri.go:89] found id: ""
	I1124 09:27:20.156913 1707070 logs.go:282] 0 containers: []
	W1124 09:27:20.156921 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:20.156926 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:20.156986 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:20.182706 1707070 cri.go:89] found id: ""
	I1124 09:27:20.182721 1707070 logs.go:282] 0 containers: []
	W1124 09:27:20.182728 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:20.182736 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:20.182747 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:20.240720 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:20.240740 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:20.257971 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:20.257987 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:20.324806 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:20.316231   12432 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:20.316929   12432 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:20.317881   12432 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:20.319464   12432 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:20.319918   12432 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:20.316231   12432 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:20.316929   12432 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:20.317881   12432 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:20.319464   12432 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:20.319918   12432 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:20.324827 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:20.324838 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:20.386188 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:20.386212 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:22.915679 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:22.927190 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:22.927254 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:22.959235 1707070 cri.go:89] found id: ""
	I1124 09:27:22.959249 1707070 logs.go:282] 0 containers: []
	W1124 09:27:22.959256 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:22.959262 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:22.959318 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:22.986124 1707070 cri.go:89] found id: ""
	I1124 09:27:22.986138 1707070 logs.go:282] 0 containers: []
	W1124 09:27:22.986146 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:22.986151 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:22.986206 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:23.016094 1707070 cri.go:89] found id: ""
	I1124 09:27:23.016108 1707070 logs.go:282] 0 containers: []
	W1124 09:27:23.016116 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:23.016121 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:23.016183 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:23.044417 1707070 cri.go:89] found id: ""
	I1124 09:27:23.044431 1707070 logs.go:282] 0 containers: []
	W1124 09:27:23.044439 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:23.044444 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:23.044501 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:23.069468 1707070 cri.go:89] found id: ""
	I1124 09:27:23.069484 1707070 logs.go:282] 0 containers: []
	W1124 09:27:23.069491 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:23.069497 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:23.069556 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:23.096521 1707070 cri.go:89] found id: ""
	I1124 09:27:23.096535 1707070 logs.go:282] 0 containers: []
	W1124 09:27:23.096542 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:23.096548 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:23.096605 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:23.125327 1707070 cri.go:89] found id: ""
	I1124 09:27:23.125342 1707070 logs.go:282] 0 containers: []
	W1124 09:27:23.125349 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:23.125358 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:23.125367 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:23.180584 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:23.180605 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:23.197372 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:23.197388 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:23.259943 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:23.251679   12538 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:23.252410   12538 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:23.253306   12538 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:23.254866   12538 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:23.255334   12538 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:23.251679   12538 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:23.252410   12538 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:23.253306   12538 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:23.254866   12538 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:23.255334   12538 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:23.259953 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:23.259965 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:23.325045 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:23.325066 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:25.855733 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:25.866329 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:25.866395 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:25.906494 1707070 cri.go:89] found id: ""
	I1124 09:27:25.906508 1707070 logs.go:282] 0 containers: []
	W1124 09:27:25.906516 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:25.906521 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:25.906590 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:25.945205 1707070 cri.go:89] found id: ""
	I1124 09:27:25.945229 1707070 logs.go:282] 0 containers: []
	W1124 09:27:25.945237 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:25.945242 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:25.945301 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:25.970721 1707070 cri.go:89] found id: ""
	I1124 09:27:25.970736 1707070 logs.go:282] 0 containers: []
	W1124 09:27:25.970743 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:25.970749 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:25.970807 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:25.997334 1707070 cri.go:89] found id: ""
	I1124 09:27:25.997348 1707070 logs.go:282] 0 containers: []
	W1124 09:27:25.997355 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:25.997364 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:25.997438 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:26.029916 1707070 cri.go:89] found id: ""
	I1124 09:27:26.029932 1707070 logs.go:282] 0 containers: []
	W1124 09:27:26.029940 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:26.029945 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:26.030007 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:26.057466 1707070 cri.go:89] found id: ""
	I1124 09:27:26.057480 1707070 logs.go:282] 0 containers: []
	W1124 09:27:26.057488 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:26.057494 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:26.057565 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:26.083489 1707070 cri.go:89] found id: ""
	I1124 09:27:26.083503 1707070 logs.go:282] 0 containers: []
	W1124 09:27:26.083511 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:26.083519 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:26.083529 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:26.140569 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:26.140588 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:26.158554 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:26.158571 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:26.230573 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:26.222615   12645 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:26.223218   12645 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:26.224819   12645 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:26.225472   12645 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:26.226976   12645 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:26.222615   12645 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:26.223218   12645 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:26.224819   12645 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:26.225472   12645 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:26.226976   12645 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:26.230583 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:26.230594 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:26.292417 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:26.292436 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:28.819944 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:28.830528 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:28.830587 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:28.854228 1707070 cri.go:89] found id: ""
	I1124 09:27:28.854243 1707070 logs.go:282] 0 containers: []
	W1124 09:27:28.854250 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:28.854260 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:28.854324 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:28.891203 1707070 cri.go:89] found id: ""
	I1124 09:27:28.891217 1707070 logs.go:282] 0 containers: []
	W1124 09:27:28.891224 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:28.891230 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:28.891305 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:28.918573 1707070 cri.go:89] found id: ""
	I1124 09:27:28.918587 1707070 logs.go:282] 0 containers: []
	W1124 09:27:28.918594 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:28.918600 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:28.918665 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:28.944672 1707070 cri.go:89] found id: ""
	I1124 09:27:28.944685 1707070 logs.go:282] 0 containers: []
	W1124 09:27:28.944692 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:28.944708 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:28.944763 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:28.970414 1707070 cri.go:89] found id: ""
	I1124 09:27:28.970429 1707070 logs.go:282] 0 containers: []
	W1124 09:27:28.970436 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:28.970441 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:28.970539 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:28.995438 1707070 cri.go:89] found id: ""
	I1124 09:27:28.995453 1707070 logs.go:282] 0 containers: []
	W1124 09:27:28.995460 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:28.995466 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:28.995526 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:29.023817 1707070 cri.go:89] found id: ""
	I1124 09:27:29.023832 1707070 logs.go:282] 0 containers: []
	W1124 09:27:29.023839 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:29.023847 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:29.023858 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:29.080316 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:29.080336 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:29.097486 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:29.097502 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:29.159875 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:29.151793   12751 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:29.152163   12751 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:29.153608   12751 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:29.154019   12751 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:29.155829   12751 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:29.151793   12751 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:29.152163   12751 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:29.153608   12751 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:29.154019   12751 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:29.155829   12751 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:29.159888 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:29.159907 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:29.223729 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:29.223754 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:31.751641 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:31.761798 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:31.761859 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:31.788691 1707070 cri.go:89] found id: ""
	I1124 09:27:31.788705 1707070 logs.go:282] 0 containers: []
	W1124 09:27:31.788711 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:31.788717 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:31.788776 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:31.812359 1707070 cri.go:89] found id: ""
	I1124 09:27:31.812374 1707070 logs.go:282] 0 containers: []
	W1124 09:27:31.812382 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:31.812387 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:31.812450 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:31.837276 1707070 cri.go:89] found id: ""
	I1124 09:27:31.837289 1707070 logs.go:282] 0 containers: []
	W1124 09:27:31.837296 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:31.837302 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:31.837360 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:31.862818 1707070 cri.go:89] found id: ""
	I1124 09:27:31.862832 1707070 logs.go:282] 0 containers: []
	W1124 09:27:31.862840 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:31.862846 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:31.862903 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:31.904922 1707070 cri.go:89] found id: ""
	I1124 09:27:31.904936 1707070 logs.go:282] 0 containers: []
	W1124 09:27:31.904944 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:31.904950 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:31.905012 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:31.949580 1707070 cri.go:89] found id: ""
	I1124 09:27:31.949594 1707070 logs.go:282] 0 containers: []
	W1124 09:27:31.949601 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:31.949607 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:31.949661 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:31.975157 1707070 cri.go:89] found id: ""
	I1124 09:27:31.975171 1707070 logs.go:282] 0 containers: []
	W1124 09:27:31.975178 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:31.975187 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:31.975198 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:32.004216 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:32.004239 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:32.064444 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:32.064466 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:32.084210 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:32.084229 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:32.152949 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:32.144237   12868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:32.145124   12868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:32.147159   12868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:32.147890   12868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:32.148900   12868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:32.144237   12868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:32.145124   12868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:32.147159   12868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:32.147890   12868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:32.148900   12868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:32.152963 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:32.152975 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:34.714493 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:34.725033 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:34.725101 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:34.750339 1707070 cri.go:89] found id: ""
	I1124 09:27:34.750352 1707070 logs.go:282] 0 containers: []
	W1124 09:27:34.750359 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:34.750365 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:34.750422 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:34.776574 1707070 cri.go:89] found id: ""
	I1124 09:27:34.776588 1707070 logs.go:282] 0 containers: []
	W1124 09:27:34.776595 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:34.776600 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:34.776656 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:34.801274 1707070 cri.go:89] found id: ""
	I1124 09:27:34.801288 1707070 logs.go:282] 0 containers: []
	W1124 09:27:34.801295 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:34.801300 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:34.801355 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:34.828204 1707070 cri.go:89] found id: ""
	I1124 09:27:34.828217 1707070 logs.go:282] 0 containers: []
	W1124 09:27:34.828224 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:34.828230 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:34.828286 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:34.856488 1707070 cri.go:89] found id: ""
	I1124 09:27:34.856502 1707070 logs.go:282] 0 containers: []
	W1124 09:27:34.856509 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:34.856514 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:34.856571 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:34.882889 1707070 cri.go:89] found id: ""
	I1124 09:27:34.882903 1707070 logs.go:282] 0 containers: []
	W1124 09:27:34.882914 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:34.882919 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:34.882988 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:34.914562 1707070 cri.go:89] found id: ""
	I1124 09:27:34.914576 1707070 logs.go:282] 0 containers: []
	W1124 09:27:34.914583 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:34.914591 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:34.914601 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:34.981562 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:34.981596 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:34.998925 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:34.998941 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:35.070877 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:35.062206   12962 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:35.063028   12962 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:35.064710   12962 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:35.065308   12962 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:35.067060   12962 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:35.062206   12962 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:35.063028   12962 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:35.064710   12962 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:35.065308   12962 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:35.067060   12962 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:35.070899 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:35.070909 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:35.137172 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:35.137193 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:37.666865 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:37.677121 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:37.677182 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:37.702376 1707070 cri.go:89] found id: ""
	I1124 09:27:37.702390 1707070 logs.go:282] 0 containers: []
	W1124 09:27:37.702398 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:37.702407 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:37.702491 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:37.727342 1707070 cri.go:89] found id: ""
	I1124 09:27:37.727355 1707070 logs.go:282] 0 containers: []
	W1124 09:27:37.727363 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:37.727368 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:37.727430 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:37.753323 1707070 cri.go:89] found id: ""
	I1124 09:27:37.753336 1707070 logs.go:282] 0 containers: []
	W1124 09:27:37.753343 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:37.753349 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:37.753409 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:37.781020 1707070 cri.go:89] found id: ""
	I1124 09:27:37.781041 1707070 logs.go:282] 0 containers: []
	W1124 09:27:37.781049 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:37.781055 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:37.781117 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:37.805925 1707070 cri.go:89] found id: ""
	I1124 09:27:37.805939 1707070 logs.go:282] 0 containers: []
	W1124 09:27:37.805946 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:37.805952 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:37.806013 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:37.833036 1707070 cri.go:89] found id: ""
	I1124 09:27:37.833062 1707070 logs.go:282] 0 containers: []
	W1124 09:27:37.833069 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:37.833075 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:37.833140 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:37.860115 1707070 cri.go:89] found id: ""
	I1124 09:27:37.860129 1707070 logs.go:282] 0 containers: []
	W1124 09:27:37.860137 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:37.860145 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:37.860156 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:37.926098 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:37.926118 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:37.960030 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:37.960045 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:38.019375 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:38.019395 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:38.039066 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:38.039085 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:38.110062 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:38.101570   13079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:38.102692   13079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:38.104495   13079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:38.105053   13079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:38.106366   13079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:38.101570   13079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:38.102692   13079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:38.104495   13079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:38.105053   13079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:38.106366   13079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:40.610482 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:40.620402 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:40.620472 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:40.648289 1707070 cri.go:89] found id: ""
	I1124 09:27:40.648303 1707070 logs.go:282] 0 containers: []
	W1124 09:27:40.648311 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:40.648317 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:40.648373 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:40.672588 1707070 cri.go:89] found id: ""
	I1124 09:27:40.672603 1707070 logs.go:282] 0 containers: []
	W1124 09:27:40.672610 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:40.672616 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:40.672673 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:40.700039 1707070 cri.go:89] found id: ""
	I1124 09:27:40.700053 1707070 logs.go:282] 0 containers: []
	W1124 09:27:40.700060 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:40.700066 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:40.700129 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:40.728494 1707070 cri.go:89] found id: ""
	I1124 09:27:40.728508 1707070 logs.go:282] 0 containers: []
	W1124 09:27:40.728516 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:40.728522 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:40.728582 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:40.753773 1707070 cri.go:89] found id: ""
	I1124 09:27:40.753786 1707070 logs.go:282] 0 containers: []
	W1124 09:27:40.753793 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:40.753798 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:40.753860 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:40.778243 1707070 cri.go:89] found id: ""
	I1124 09:27:40.778257 1707070 logs.go:282] 0 containers: []
	W1124 09:27:40.778264 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:40.778270 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:40.778333 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:40.804316 1707070 cri.go:89] found id: ""
	I1124 09:27:40.804329 1707070 logs.go:282] 0 containers: []
	W1124 09:27:40.804350 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:40.804358 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:40.804370 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:40.821314 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:40.821330 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:40.901213 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:40.878028   13164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:40.878824   13164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:40.894654   13164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:40.895170   13164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:40.896920   13164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:40.878028   13164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:40.878824   13164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:40.894654   13164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:40.895170   13164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:40.896920   13164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:40.901232 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:40.901242 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:40.972785 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:40.972806 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:41.000947 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:41.000967 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:43.560416 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:43.570821 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:43.570882 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:43.595557 1707070 cri.go:89] found id: ""
	I1124 09:27:43.595571 1707070 logs.go:282] 0 containers: []
	W1124 09:27:43.595579 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:43.595585 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:43.595640 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:43.623980 1707070 cri.go:89] found id: ""
	I1124 09:27:43.623996 1707070 logs.go:282] 0 containers: []
	W1124 09:27:43.624003 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:43.624008 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:43.624074 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:43.649674 1707070 cri.go:89] found id: ""
	I1124 09:27:43.649688 1707070 logs.go:282] 0 containers: []
	W1124 09:27:43.649695 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:43.649701 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:43.649758 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:43.673375 1707070 cri.go:89] found id: ""
	I1124 09:27:43.673388 1707070 logs.go:282] 0 containers: []
	W1124 09:27:43.673397 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:43.673403 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:43.673459 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:43.700917 1707070 cri.go:89] found id: ""
	I1124 09:27:43.700931 1707070 logs.go:282] 0 containers: []
	W1124 09:27:43.700938 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:43.700943 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:43.701000 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:43.725453 1707070 cri.go:89] found id: ""
	I1124 09:27:43.725467 1707070 logs.go:282] 0 containers: []
	W1124 09:27:43.725481 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:43.725487 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:43.725557 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:43.755304 1707070 cri.go:89] found id: ""
	I1124 09:27:43.755318 1707070 logs.go:282] 0 containers: []
	W1124 09:27:43.755326 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:43.755335 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:43.755346 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:43.772549 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:43.772567 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:43.837565 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:43.829378   13269 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:43.829969   13269 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:43.831587   13269 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:43.832265   13269 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:43.833938   13269 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:43.829378   13269 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:43.829969   13269 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:43.831587   13269 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:43.832265   13269 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:43.833938   13269 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:43.837575 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:43.837587 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:43.898949 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:43.898969 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:43.934259 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:43.934277 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:46.497111 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:46.507177 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:46.507251 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:46.531012 1707070 cri.go:89] found id: ""
	I1124 09:27:46.531025 1707070 logs.go:282] 0 containers: []
	W1124 09:27:46.531032 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:46.531038 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:46.531101 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:46.555781 1707070 cri.go:89] found id: ""
	I1124 09:27:46.555795 1707070 logs.go:282] 0 containers: []
	W1124 09:27:46.555802 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:46.555807 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:46.555864 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:46.580956 1707070 cri.go:89] found id: ""
	I1124 09:27:46.580974 1707070 logs.go:282] 0 containers: []
	W1124 09:27:46.580982 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:46.580987 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:46.581055 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:46.606320 1707070 cri.go:89] found id: ""
	I1124 09:27:46.606333 1707070 logs.go:282] 0 containers: []
	W1124 09:27:46.606340 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:46.606346 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:46.606414 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:46.632671 1707070 cri.go:89] found id: ""
	I1124 09:27:46.632685 1707070 logs.go:282] 0 containers: []
	W1124 09:27:46.632692 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:46.632697 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:46.632755 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:46.656948 1707070 cri.go:89] found id: ""
	I1124 09:27:46.656962 1707070 logs.go:282] 0 containers: []
	W1124 09:27:46.656969 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:46.656975 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:46.657037 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:46.681897 1707070 cri.go:89] found id: ""
	I1124 09:27:46.681910 1707070 logs.go:282] 0 containers: []
	W1124 09:27:46.681917 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:46.681925 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:46.681936 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:46.698822 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:46.698839 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:46.763473 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:46.755294   13373 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:46.755864   13373 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:46.757448   13373 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:46.758065   13373 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:46.759847   13373 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:46.755294   13373 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:46.755864   13373 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:46.757448   13373 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:46.758065   13373 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:46.759847   13373 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:46.763499 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:46.763510 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:46.826271 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:46.826293 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:46.855001 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:46.855017 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:49.412865 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:49.423511 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:49.423574 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:49.447618 1707070 cri.go:89] found id: ""
	I1124 09:27:49.447632 1707070 logs.go:282] 0 containers: []
	W1124 09:27:49.447639 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:49.447645 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:49.447705 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:49.476127 1707070 cri.go:89] found id: ""
	I1124 09:27:49.476140 1707070 logs.go:282] 0 containers: []
	W1124 09:27:49.476147 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:49.476154 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:49.476213 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:49.501684 1707070 cri.go:89] found id: ""
	I1124 09:27:49.501697 1707070 logs.go:282] 0 containers: []
	W1124 09:27:49.501705 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:49.501711 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:49.501771 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:49.527011 1707070 cri.go:89] found id: ""
	I1124 09:27:49.527025 1707070 logs.go:282] 0 containers: []
	W1124 09:27:49.527033 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:49.527038 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:49.527098 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:49.552026 1707070 cri.go:89] found id: ""
	I1124 09:27:49.552040 1707070 logs.go:282] 0 containers: []
	W1124 09:27:49.552047 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:49.552053 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:49.552110 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:49.582162 1707070 cri.go:89] found id: ""
	I1124 09:27:49.582189 1707070 logs.go:282] 0 containers: []
	W1124 09:27:49.582196 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:49.582202 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:49.582275 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:49.612653 1707070 cri.go:89] found id: ""
	I1124 09:27:49.612667 1707070 logs.go:282] 0 containers: []
	W1124 09:27:49.612675 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:49.612683 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:49.612693 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:49.668483 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:49.668504 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:49.685463 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:49.685480 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:49.750076 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:49.741868   13477 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:49.742309   13477 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:49.744083   13477 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:49.744608   13477 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:49.746375   13477 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:49.741868   13477 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:49.742309   13477 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:49.744083   13477 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:49.744608   13477 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:49.746375   13477 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:49.750136 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:49.750148 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:49.811614 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:49.811634 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:52.341239 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:52.351722 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:52.351784 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:52.378388 1707070 cri.go:89] found id: ""
	I1124 09:27:52.378402 1707070 logs.go:282] 0 containers: []
	W1124 09:27:52.378410 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:52.378416 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:52.378498 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:52.404052 1707070 cri.go:89] found id: ""
	I1124 09:27:52.404067 1707070 logs.go:282] 0 containers: []
	W1124 09:27:52.404074 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:52.404079 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:52.404138 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:52.428854 1707070 cri.go:89] found id: ""
	I1124 09:27:52.428868 1707070 logs.go:282] 0 containers: []
	W1124 09:27:52.428876 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:52.428882 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:52.428945 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:52.460795 1707070 cri.go:89] found id: ""
	I1124 09:27:52.460808 1707070 logs.go:282] 0 containers: []
	W1124 09:27:52.460815 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:52.460825 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:52.460886 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:52.490351 1707070 cri.go:89] found id: ""
	I1124 09:27:52.490365 1707070 logs.go:282] 0 containers: []
	W1124 09:27:52.490372 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:52.490378 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:52.490438 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:52.515789 1707070 cri.go:89] found id: ""
	I1124 09:27:52.515804 1707070 logs.go:282] 0 containers: []
	W1124 09:27:52.515811 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:52.515816 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:52.515874 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:52.544304 1707070 cri.go:89] found id: ""
	I1124 09:27:52.544318 1707070 logs.go:282] 0 containers: []
	W1124 09:27:52.544326 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:52.544335 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:52.544347 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:52.611718 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:52.603411   13577 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:52.604016   13577 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:52.605628   13577 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:52.606175   13577 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:52.607864   13577 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:52.603411   13577 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:52.604016   13577 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:52.605628   13577 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:52.606175   13577 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:52.607864   13577 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:52.611731 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:52.611743 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:52.679720 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:52.679740 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:52.708422 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:52.708437 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:52.766414 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:52.766433 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:55.285861 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:55.296023 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:55.296086 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:55.324396 1707070 cri.go:89] found id: ""
	I1124 09:27:55.324409 1707070 logs.go:282] 0 containers: []
	W1124 09:27:55.324417 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:55.324422 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:55.324478 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:55.348746 1707070 cri.go:89] found id: ""
	I1124 09:27:55.348760 1707070 logs.go:282] 0 containers: []
	W1124 09:27:55.348767 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:55.348773 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:55.348832 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:55.373685 1707070 cri.go:89] found id: ""
	I1124 09:27:55.373710 1707070 logs.go:282] 0 containers: []
	W1124 09:27:55.373718 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:55.373724 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:55.373780 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:55.399757 1707070 cri.go:89] found id: ""
	I1124 09:27:55.399774 1707070 logs.go:282] 0 containers: []
	W1124 09:27:55.399783 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:55.399789 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:55.399848 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:55.424773 1707070 cri.go:89] found id: ""
	I1124 09:27:55.424788 1707070 logs.go:282] 0 containers: []
	W1124 09:27:55.424795 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:55.424800 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:55.424862 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:55.450083 1707070 cri.go:89] found id: ""
	I1124 09:27:55.450097 1707070 logs.go:282] 0 containers: []
	W1124 09:27:55.450104 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:55.450112 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:55.450170 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:55.474225 1707070 cri.go:89] found id: ""
	I1124 09:27:55.474239 1707070 logs.go:282] 0 containers: []
	W1124 09:27:55.474247 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:55.474254 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:55.474264 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:55.507455 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:55.507477 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:55.563391 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:55.563414 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:55.583115 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:55.583131 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:55.648979 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:55.641409   13700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:55.642033   13700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:55.643543   13700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:55.644021   13700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:55.645529   13700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:55.641409   13700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:55.642033   13700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:55.643543   13700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:55.644021   13700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:55.645529   13700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:55.648991 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:55.649004 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:58.210584 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:58.221285 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:58.221351 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:58.250526 1707070 cri.go:89] found id: ""
	I1124 09:27:58.250541 1707070 logs.go:282] 0 containers: []
	W1124 09:27:58.250548 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:58.250554 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:58.250612 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:58.275099 1707070 cri.go:89] found id: ""
	I1124 09:27:58.275116 1707070 logs.go:282] 0 containers: []
	W1124 09:27:58.275123 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:58.275129 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:58.275189 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:58.300058 1707070 cri.go:89] found id: ""
	I1124 09:27:58.300075 1707070 logs.go:282] 0 containers: []
	W1124 09:27:58.300082 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:58.300087 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:58.300148 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:58.323564 1707070 cri.go:89] found id: ""
	I1124 09:27:58.323578 1707070 logs.go:282] 0 containers: []
	W1124 09:27:58.323585 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:58.323591 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:58.323648 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:58.348441 1707070 cri.go:89] found id: ""
	I1124 09:27:58.348455 1707070 logs.go:282] 0 containers: []
	W1124 09:27:58.348463 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:58.348468 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:58.348527 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:58.374283 1707070 cri.go:89] found id: ""
	I1124 09:27:58.374297 1707070 logs.go:282] 0 containers: []
	W1124 09:27:58.374305 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:58.374310 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:58.374371 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:58.400624 1707070 cri.go:89] found id: ""
	I1124 09:27:58.400638 1707070 logs.go:282] 0 containers: []
	W1124 09:27:58.400645 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:58.400653 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:58.400664 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:58.457055 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:58.457075 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:58.474204 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:58.474236 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:58.538738 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:58.530985   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:58.531628   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:58.533238   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:58.533555   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:58.535049   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:58.530985   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:58.531628   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:58.533238   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:58.533555   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:58.535049   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:58.538748 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:58.538761 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:58.601043 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:58.601064 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:01.129158 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:01.152628 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:01.152709 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:01.199688 1707070 cri.go:89] found id: ""
	I1124 09:28:01.199703 1707070 logs.go:282] 0 containers: []
	W1124 09:28:01.199710 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:01.199716 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:01.199778 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:01.226293 1707070 cri.go:89] found id: ""
	I1124 09:28:01.226307 1707070 logs.go:282] 0 containers: []
	W1124 09:28:01.226314 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:01.226319 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:01.226379 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:01.252021 1707070 cri.go:89] found id: ""
	I1124 09:28:01.252036 1707070 logs.go:282] 0 containers: []
	W1124 09:28:01.252043 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:01.252049 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:01.252108 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:01.278563 1707070 cri.go:89] found id: ""
	I1124 09:28:01.278577 1707070 logs.go:282] 0 containers: []
	W1124 09:28:01.278585 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:01.278591 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:01.278697 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:01.304781 1707070 cri.go:89] found id: ""
	I1124 09:28:01.304808 1707070 logs.go:282] 0 containers: []
	W1124 09:28:01.304816 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:01.304822 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:01.304900 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:01.330549 1707070 cri.go:89] found id: ""
	I1124 09:28:01.330574 1707070 logs.go:282] 0 containers: []
	W1124 09:28:01.330581 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:01.330586 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:01.330657 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:01.355624 1707070 cri.go:89] found id: ""
	I1124 09:28:01.355646 1707070 logs.go:282] 0 containers: []
	W1124 09:28:01.355654 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:01.355661 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:01.355673 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:01.411485 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:01.411504 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:01.428912 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:01.428927 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:01.493859 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:01.485758   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:01.486490   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:01.488127   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:01.488656   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:01.490257   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:01.485758   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:01.486490   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:01.488127   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:01.488656   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:01.490257   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:01.493881 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:01.493892 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:01.554787 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:01.554808 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:04.088481 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:04.099124 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:04.099191 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:04.123836 1707070 cri.go:89] found id: ""
	I1124 09:28:04.123849 1707070 logs.go:282] 0 containers: []
	W1124 09:28:04.123857 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:04.123862 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:04.123927 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:04.159485 1707070 cri.go:89] found id: ""
	I1124 09:28:04.159499 1707070 logs.go:282] 0 containers: []
	W1124 09:28:04.159506 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:04.159511 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:04.159572 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:04.187075 1707070 cri.go:89] found id: ""
	I1124 09:28:04.187089 1707070 logs.go:282] 0 containers: []
	W1124 09:28:04.187106 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:04.187112 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:04.187169 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:04.217664 1707070 cri.go:89] found id: ""
	I1124 09:28:04.217677 1707070 logs.go:282] 0 containers: []
	W1124 09:28:04.217696 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:04.217702 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:04.217769 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:04.244060 1707070 cri.go:89] found id: ""
	I1124 09:28:04.244075 1707070 logs.go:282] 0 containers: []
	W1124 09:28:04.244082 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:04.244087 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:04.244151 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:04.269297 1707070 cri.go:89] found id: ""
	I1124 09:28:04.269311 1707070 logs.go:282] 0 containers: []
	W1124 09:28:04.269318 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:04.269323 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:04.269382 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:04.296714 1707070 cri.go:89] found id: ""
	I1124 09:28:04.296730 1707070 logs.go:282] 0 containers: []
	W1124 09:28:04.296737 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:04.296745 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:04.296760 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:04.352538 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:04.352558 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:04.370334 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:04.370357 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:04.439006 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:04.429890   13996 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:04.430808   13996 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:04.432656   13996 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:04.433242   13996 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:04.435153   13996 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:04.429890   13996 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:04.430808   13996 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:04.432656   13996 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:04.433242   13996 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:04.435153   13996 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:04.439018 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:04.439027 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:04.503050 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:04.503072 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:07.038611 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:07.049789 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:07.049861 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:07.074863 1707070 cri.go:89] found id: ""
	I1124 09:28:07.074878 1707070 logs.go:282] 0 containers: []
	W1124 09:28:07.074885 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:07.074893 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:07.074950 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:07.099042 1707070 cri.go:89] found id: ""
	I1124 09:28:07.099057 1707070 logs.go:282] 0 containers: []
	W1124 09:28:07.099064 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:07.099070 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:07.099131 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:07.123608 1707070 cri.go:89] found id: ""
	I1124 09:28:07.123622 1707070 logs.go:282] 0 containers: []
	W1124 09:28:07.123630 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:07.123635 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:07.123706 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:07.151391 1707070 cri.go:89] found id: ""
	I1124 09:28:07.151405 1707070 logs.go:282] 0 containers: []
	W1124 09:28:07.151412 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:07.151418 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:07.151475 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:07.182488 1707070 cri.go:89] found id: ""
	I1124 09:28:07.182502 1707070 logs.go:282] 0 containers: []
	W1124 09:28:07.182510 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:07.182515 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:07.182581 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:07.207523 1707070 cri.go:89] found id: ""
	I1124 09:28:07.207537 1707070 logs.go:282] 0 containers: []
	W1124 09:28:07.207546 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:07.207552 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:07.207614 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:07.233412 1707070 cri.go:89] found id: ""
	I1124 09:28:07.233426 1707070 logs.go:282] 0 containers: []
	W1124 09:28:07.233433 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:07.233441 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:07.233451 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:07.288900 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:07.288922 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:07.306472 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:07.306493 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:07.368097 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:07.360574   14102 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:07.360956   14102 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:07.362483   14102 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:07.362820   14102 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:07.364269   14102 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:07.360574   14102 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:07.360956   14102 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:07.362483   14102 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:07.362820   14102 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:07.364269   14102 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:07.368108 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:07.368121 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:07.429983 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:07.430002 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:09.965289 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:09.976378 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:09.976448 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:10.015687 1707070 cri.go:89] found id: ""
	I1124 09:28:10.015705 1707070 logs.go:282] 0 containers: []
	W1124 09:28:10.015714 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:10.015721 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:10.015811 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:10.042717 1707070 cri.go:89] found id: ""
	I1124 09:28:10.042731 1707070 logs.go:282] 0 containers: []
	W1124 09:28:10.042738 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:10.042743 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:10.042805 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:10.069226 1707070 cri.go:89] found id: ""
	I1124 09:28:10.069240 1707070 logs.go:282] 0 containers: []
	W1124 09:28:10.069259 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:10.069265 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:10.069336 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:10.094576 1707070 cri.go:89] found id: ""
	I1124 09:28:10.094591 1707070 logs.go:282] 0 containers: []
	W1124 09:28:10.094599 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:10.094604 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:10.094683 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:10.120910 1707070 cri.go:89] found id: ""
	I1124 09:28:10.120925 1707070 logs.go:282] 0 containers: []
	W1124 09:28:10.120932 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:10.120938 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:10.121007 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:10.148454 1707070 cri.go:89] found id: ""
	I1124 09:28:10.148467 1707070 logs.go:282] 0 containers: []
	W1124 09:28:10.148476 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:10.148482 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:10.148545 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:10.180342 1707070 cri.go:89] found id: ""
	I1124 09:28:10.180356 1707070 logs.go:282] 0 containers: []
	W1124 09:28:10.180363 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:10.180377 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:10.180387 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:10.237982 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:10.238001 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:10.254875 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:10.254891 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:10.315902 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:10.307876   14207 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:10.308640   14207 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:10.310183   14207 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:10.310727   14207 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:10.312228   14207 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:10.307876   14207 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:10.308640   14207 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:10.310183   14207 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:10.310727   14207 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:10.312228   14207 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:10.315912 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:10.315922 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:10.381257 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:10.381276 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:12.913595 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:12.923674 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:12.923734 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:12.947804 1707070 cri.go:89] found id: ""
	I1124 09:28:12.947818 1707070 logs.go:282] 0 containers: []
	W1124 09:28:12.947826 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:12.947832 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:12.947892 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:12.971923 1707070 cri.go:89] found id: ""
	I1124 09:28:12.971937 1707070 logs.go:282] 0 containers: []
	W1124 09:28:12.971944 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:12.971956 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:12.972017 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:12.996325 1707070 cri.go:89] found id: ""
	I1124 09:28:12.996339 1707070 logs.go:282] 0 containers: []
	W1124 09:28:12.996357 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:12.996364 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:12.996436 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:13.022187 1707070 cri.go:89] found id: ""
	I1124 09:28:13.022203 1707070 logs.go:282] 0 containers: []
	W1124 09:28:13.022211 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:13.022224 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:13.022296 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:13.048161 1707070 cri.go:89] found id: ""
	I1124 09:28:13.048184 1707070 logs.go:282] 0 containers: []
	W1124 09:28:13.048192 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:13.048198 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:13.048262 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:13.073539 1707070 cri.go:89] found id: ""
	I1124 09:28:13.073564 1707070 logs.go:282] 0 containers: []
	W1124 09:28:13.073571 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:13.073578 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:13.073655 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:13.098089 1707070 cri.go:89] found id: ""
	I1124 09:28:13.098106 1707070 logs.go:282] 0 containers: []
	W1124 09:28:13.098114 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:13.098122 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:13.098132 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:13.140239 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:13.140255 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:13.197847 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:13.197865 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:13.217667 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:13.217686 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:13.281312 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:13.272865   14321 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:13.273748   14321 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:13.275370   14321 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:13.275717   14321 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:13.277237   14321 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:13.272865   14321 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:13.273748   14321 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:13.275370   14321 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:13.275717   14321 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:13.277237   14321 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:13.281322 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:13.281334 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:15.842684 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:15.853250 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:15.853311 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:15.878981 1707070 cri.go:89] found id: ""
	I1124 09:28:15.878995 1707070 logs.go:282] 0 containers: []
	W1124 09:28:15.879030 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:15.879036 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:15.879099 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:15.904674 1707070 cri.go:89] found id: ""
	I1124 09:28:15.904687 1707070 logs.go:282] 0 containers: []
	W1124 09:28:15.904695 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:15.904700 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:15.904757 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:15.929766 1707070 cri.go:89] found id: ""
	I1124 09:28:15.929780 1707070 logs.go:282] 0 containers: []
	W1124 09:28:15.929787 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:15.929793 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:15.929851 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:15.955453 1707070 cri.go:89] found id: ""
	I1124 09:28:15.955468 1707070 logs.go:282] 0 containers: []
	W1124 09:28:15.955475 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:15.955485 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:15.955543 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:15.983839 1707070 cri.go:89] found id: ""
	I1124 09:28:15.983854 1707070 logs.go:282] 0 containers: []
	W1124 09:28:15.983861 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:15.983866 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:15.983924 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:16.014730 1707070 cri.go:89] found id: ""
	I1124 09:28:16.014744 1707070 logs.go:282] 0 containers: []
	W1124 09:28:16.014752 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:16.014757 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:16.014820 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:16.046753 1707070 cri.go:89] found id: ""
	I1124 09:28:16.046767 1707070 logs.go:282] 0 containers: []
	W1124 09:28:16.046775 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:16.046783 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:16.046794 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:16.064199 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:16.064217 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:16.139691 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:16.122247   14407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:16.122923   14407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:16.124768   14407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:16.125231   14407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:16.126838   14407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:16.122247   14407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:16.122923   14407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:16.124768   14407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:16.125231   14407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:16.126838   14407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:16.139701 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:16.139711 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:16.206802 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:16.206822 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:16.234674 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:16.234690 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:18.790282 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:18.801848 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:18.801912 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:18.827821 1707070 cri.go:89] found id: ""
	I1124 09:28:18.827836 1707070 logs.go:282] 0 containers: []
	W1124 09:28:18.827843 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:18.827849 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:18.827905 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:18.852169 1707070 cri.go:89] found id: ""
	I1124 09:28:18.852184 1707070 logs.go:282] 0 containers: []
	W1124 09:28:18.852191 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:18.852196 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:18.852253 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:18.878610 1707070 cri.go:89] found id: ""
	I1124 09:28:18.878625 1707070 logs.go:282] 0 containers: []
	W1124 09:28:18.878633 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:18.878638 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:18.878702 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:18.903384 1707070 cri.go:89] found id: ""
	I1124 09:28:18.903403 1707070 logs.go:282] 0 containers: []
	W1124 09:28:18.903410 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:18.903416 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:18.903476 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:18.928519 1707070 cri.go:89] found id: ""
	I1124 09:28:18.928534 1707070 logs.go:282] 0 containers: []
	W1124 09:28:18.928542 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:18.928547 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:18.928609 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:18.956808 1707070 cri.go:89] found id: ""
	I1124 09:28:18.956823 1707070 logs.go:282] 0 containers: []
	W1124 09:28:18.956830 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:18.956836 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:18.956893 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:18.985113 1707070 cri.go:89] found id: ""
	I1124 09:28:18.985127 1707070 logs.go:282] 0 containers: []
	W1124 09:28:18.985134 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:18.985142 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:18.985152 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:19.019130 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:19.019146 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:19.075193 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:19.075213 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:19.092291 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:19.092306 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:19.162819 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:19.154959   14526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:19.155361   14526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:19.156834   14526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:19.157156   14526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:19.158629   14526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:19.154959   14526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:19.155361   14526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:19.156834   14526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:19.157156   14526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:19.158629   14526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:19.162839 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:19.162850 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:21.737895 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:21.748053 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:21.748120 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:21.773590 1707070 cri.go:89] found id: ""
	I1124 09:28:21.773604 1707070 logs.go:282] 0 containers: []
	W1124 09:28:21.773611 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:21.773618 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:21.773679 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:21.800809 1707070 cri.go:89] found id: ""
	I1124 09:28:21.800866 1707070 logs.go:282] 0 containers: []
	W1124 09:28:21.800874 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:21.800880 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:21.800938 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:21.826581 1707070 cri.go:89] found id: ""
	I1124 09:28:21.826594 1707070 logs.go:282] 0 containers: []
	W1124 09:28:21.826602 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:21.826607 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:21.826668 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:21.856267 1707070 cri.go:89] found id: ""
	I1124 09:28:21.856282 1707070 logs.go:282] 0 containers: []
	W1124 09:28:21.856289 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:21.856295 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:21.856354 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:21.885138 1707070 cri.go:89] found id: ""
	I1124 09:28:21.885152 1707070 logs.go:282] 0 containers: []
	W1124 09:28:21.885160 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:21.885165 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:21.885224 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:21.909643 1707070 cri.go:89] found id: ""
	I1124 09:28:21.909657 1707070 logs.go:282] 0 containers: []
	W1124 09:28:21.909665 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:21.909671 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:21.909727 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:21.936792 1707070 cri.go:89] found id: ""
	I1124 09:28:21.936806 1707070 logs.go:282] 0 containers: []
	W1124 09:28:21.936813 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:21.936821 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:21.936831 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:21.993870 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:21.993890 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:22.011453 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:22.011474 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:22.078376 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:22.069998   14620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:22.070791   14620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:22.072423   14620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:22.073020   14620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:22.074616   14620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:22.069998   14620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:22.070791   14620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:22.072423   14620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:22.073020   14620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:22.074616   14620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:22.078387 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:22.078398 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:22.140934 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:22.140953 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:24.669313 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:24.679257 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:24.679328 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:24.707632 1707070 cri.go:89] found id: ""
	I1124 09:28:24.707647 1707070 logs.go:282] 0 containers: []
	W1124 09:28:24.707654 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:24.707660 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:24.707720 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:24.733688 1707070 cri.go:89] found id: ""
	I1124 09:28:24.733702 1707070 logs.go:282] 0 containers: []
	W1124 09:28:24.733710 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:24.733715 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:24.733773 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:24.759056 1707070 cri.go:89] found id: ""
	I1124 09:28:24.759071 1707070 logs.go:282] 0 containers: []
	W1124 09:28:24.759078 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:24.759084 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:24.759143 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:24.789918 1707070 cri.go:89] found id: ""
	I1124 09:28:24.789931 1707070 logs.go:282] 0 containers: []
	W1124 09:28:24.789938 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:24.789944 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:24.790003 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:24.814684 1707070 cri.go:89] found id: ""
	I1124 09:28:24.814698 1707070 logs.go:282] 0 containers: []
	W1124 09:28:24.814709 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:24.814714 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:24.814773 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:24.839467 1707070 cri.go:89] found id: ""
	I1124 09:28:24.839489 1707070 logs.go:282] 0 containers: []
	W1124 09:28:24.839497 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:24.839503 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:24.839568 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:24.863902 1707070 cri.go:89] found id: ""
	I1124 09:28:24.863917 1707070 logs.go:282] 0 containers: []
	W1124 09:28:24.863925 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:24.863933 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:24.863943 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:24.919300 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:24.919320 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:24.936150 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:24.936167 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:24.998414 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:24.990181   14723 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:24.990882   14723 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:24.992541   14723 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:24.993206   14723 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:24.994900   14723 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:24.990181   14723 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:24.990882   14723 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:24.992541   14723 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:24.993206   14723 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:24.994900   14723 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:24.998425 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:24.998435 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:25.062735 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:25.062756 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:27.591381 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:27.601598 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:27.601658 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:27.626062 1707070 cri.go:89] found id: ""
	I1124 09:28:27.626076 1707070 logs.go:282] 0 containers: []
	W1124 09:28:27.626084 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:27.626090 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:27.626152 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:27.654571 1707070 cri.go:89] found id: ""
	I1124 09:28:27.654591 1707070 logs.go:282] 0 containers: []
	W1124 09:28:27.654599 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:27.654604 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:27.654664 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:27.679294 1707070 cri.go:89] found id: ""
	I1124 09:28:27.679308 1707070 logs.go:282] 0 containers: []
	W1124 09:28:27.679315 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:27.679320 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:27.679377 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:27.702575 1707070 cri.go:89] found id: ""
	I1124 09:28:27.702588 1707070 logs.go:282] 0 containers: []
	W1124 09:28:27.702595 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:27.702601 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:27.702657 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:27.728251 1707070 cri.go:89] found id: ""
	I1124 09:28:27.728266 1707070 logs.go:282] 0 containers: []
	W1124 09:28:27.728273 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:27.728279 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:27.728339 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:27.752789 1707070 cri.go:89] found id: ""
	I1124 09:28:27.752802 1707070 logs.go:282] 0 containers: []
	W1124 09:28:27.752809 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:27.752815 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:27.752874 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:27.776833 1707070 cri.go:89] found id: ""
	I1124 09:28:27.776847 1707070 logs.go:282] 0 containers: []
	W1124 09:28:27.776854 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:27.776862 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:27.776871 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:27.837612 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:27.837637 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:27.866873 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:27.866890 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:27.925473 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:27.925492 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:27.942415 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:27.942432 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:28.014797 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:27.999267   14840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:28.000058   14840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:28.002028   14840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:28.002995   14840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:28.005197   14840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:27.999267   14840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:28.000058   14840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:28.002028   14840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:28.002995   14840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:28.005197   14840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:30.515707 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:30.526026 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:30.526102 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:30.550904 1707070 cri.go:89] found id: ""
	I1124 09:28:30.550918 1707070 logs.go:282] 0 containers: []
	W1124 09:28:30.550925 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:30.550931 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:30.550996 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:30.580837 1707070 cri.go:89] found id: ""
	I1124 09:28:30.580851 1707070 logs.go:282] 0 containers: []
	W1124 09:28:30.580859 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:30.580864 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:30.580920 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:30.605291 1707070 cri.go:89] found id: ""
	I1124 09:28:30.605305 1707070 logs.go:282] 0 containers: []
	W1124 09:28:30.605312 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:30.605318 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:30.605376 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:30.630158 1707070 cri.go:89] found id: ""
	I1124 09:28:30.630172 1707070 logs.go:282] 0 containers: []
	W1124 09:28:30.630181 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:30.630187 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:30.630254 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:30.653754 1707070 cri.go:89] found id: ""
	I1124 09:28:30.653772 1707070 logs.go:282] 0 containers: []
	W1124 09:28:30.653785 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:30.653790 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:30.653868 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:30.679137 1707070 cri.go:89] found id: ""
	I1124 09:28:30.679150 1707070 logs.go:282] 0 containers: []
	W1124 09:28:30.679157 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:30.679163 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:30.679221 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:30.703850 1707070 cri.go:89] found id: ""
	I1124 09:28:30.703864 1707070 logs.go:282] 0 containers: []
	W1124 09:28:30.703871 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:30.703879 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:30.703888 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:30.772547 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:30.764218   14925 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:30.764926   14925 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:30.766593   14925 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:30.767134   14925 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:30.768991   14925 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:30.764218   14925 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:30.764926   14925 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:30.766593   14925 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:30.767134   14925 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:30.768991   14925 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:30.772557 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:30.772568 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:30.834024 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:30.834043 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:30.862031 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:30.862046 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:30.920292 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:30.920311 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:33.438606 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:33.448762 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:33.448822 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:33.481032 1707070 cri.go:89] found id: ""
	I1124 09:28:33.481046 1707070 logs.go:282] 0 containers: []
	W1124 09:28:33.481053 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:33.481060 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:33.481117 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:33.504561 1707070 cri.go:89] found id: ""
	I1124 09:28:33.504576 1707070 logs.go:282] 0 containers: []
	W1124 09:28:33.504583 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:33.504589 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:33.504654 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:33.528885 1707070 cri.go:89] found id: ""
	I1124 09:28:33.528899 1707070 logs.go:282] 0 containers: []
	W1124 09:28:33.528906 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:33.528915 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:33.528972 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:33.553244 1707070 cri.go:89] found id: ""
	I1124 09:28:33.553258 1707070 logs.go:282] 0 containers: []
	W1124 09:28:33.553271 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:33.553277 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:33.553334 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:33.578519 1707070 cri.go:89] found id: ""
	I1124 09:28:33.578533 1707070 logs.go:282] 0 containers: []
	W1124 09:28:33.578541 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:33.578546 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:33.578607 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:33.602708 1707070 cri.go:89] found id: ""
	I1124 09:28:33.602721 1707070 logs.go:282] 0 containers: []
	W1124 09:28:33.602729 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:33.602734 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:33.602791 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:33.626894 1707070 cri.go:89] found id: ""
	I1124 09:28:33.626908 1707070 logs.go:282] 0 containers: []
	W1124 09:28:33.626916 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:33.626923 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:33.626934 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:33.684867 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:33.684887 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:33.701817 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:33.701834 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:33.775161 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:33.766757   15037 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:33.767480   15037 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:33.769022   15037 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:33.769484   15037 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:33.770951   15037 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1124 09:28:33.775172 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:33.775185 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:33.837667 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:33.837688 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:36.365266 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:36.376558 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:36.376622 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:36.412692 1707070 cri.go:89] found id: ""
	I1124 09:28:36.412706 1707070 logs.go:282] 0 containers: []
	W1124 09:28:36.412714 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:36.412719 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:36.412777 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:36.448943 1707070 cri.go:89] found id: ""
	I1124 09:28:36.448957 1707070 logs.go:282] 0 containers: []
	W1124 09:28:36.448964 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:36.448970 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:36.449031 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:36.474906 1707070 cri.go:89] found id: ""
	I1124 09:28:36.474920 1707070 logs.go:282] 0 containers: []
	W1124 09:28:36.474928 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:36.474934 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:36.474990 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:36.503770 1707070 cri.go:89] found id: ""
	I1124 09:28:36.503784 1707070 logs.go:282] 0 containers: []
	W1124 09:28:36.503792 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:36.503797 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:36.503863 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:36.532858 1707070 cri.go:89] found id: ""
	I1124 09:28:36.532872 1707070 logs.go:282] 0 containers: []
	W1124 09:28:36.532880 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:36.532885 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:36.532944 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:36.557874 1707070 cri.go:89] found id: ""
	I1124 09:28:36.557889 1707070 logs.go:282] 0 containers: []
	W1124 09:28:36.557896 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:36.557902 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:36.557959 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:36.582175 1707070 cri.go:89] found id: ""
	I1124 09:28:36.582189 1707070 logs.go:282] 0 containers: []
	W1124 09:28:36.582204 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:36.582212 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:36.582230 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:36.645586 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:36.637487   15138 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:36.638140   15138 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:36.639873   15138 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:36.640429   15138 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:36.641968   15138 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1124 09:28:36.645596 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:36.645607 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:36.708211 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:36.708231 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:36.740877 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:36.740894 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:36.798376 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:36.798396 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:39.316746 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:39.327050 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:39.327111 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:39.351416 1707070 cri.go:89] found id: ""
	I1124 09:28:39.351430 1707070 logs.go:282] 0 containers: []
	W1124 09:28:39.351438 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:39.351444 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:39.351500 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:39.375341 1707070 cri.go:89] found id: ""
	I1124 09:28:39.375355 1707070 logs.go:282] 0 containers: []
	W1124 09:28:39.375362 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:39.375367 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:39.375425 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:39.402220 1707070 cri.go:89] found id: ""
	I1124 09:28:39.402235 1707070 logs.go:282] 0 containers: []
	W1124 09:28:39.402241 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:39.402247 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:39.402306 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:39.434081 1707070 cri.go:89] found id: ""
	I1124 09:28:39.434094 1707070 logs.go:282] 0 containers: []
	W1124 09:28:39.434101 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:39.434107 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:39.434167 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:39.467514 1707070 cri.go:89] found id: ""
	I1124 09:28:39.467528 1707070 logs.go:282] 0 containers: []
	W1124 09:28:39.467535 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:39.467540 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:39.467597 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:39.500947 1707070 cri.go:89] found id: ""
	I1124 09:28:39.500961 1707070 logs.go:282] 0 containers: []
	W1124 09:28:39.500968 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:39.500974 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:39.501034 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:39.526637 1707070 cri.go:89] found id: ""
	I1124 09:28:39.526651 1707070 logs.go:282] 0 containers: []
	W1124 09:28:39.526658 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:39.526666 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:39.526676 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:39.582247 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:39.582268 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:39.599751 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:39.599767 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:39.668271 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:39.660949   15251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:39.661446   15251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:39.663149   15251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:39.663643   15251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:39.664706   15251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1124 09:28:39.668281 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:39.668294 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:39.730931 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:39.730951 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:42.260305 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:42.272405 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:42.272489 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:42.300817 1707070 cri.go:89] found id: ""
	I1124 09:28:42.300842 1707070 logs.go:282] 0 containers: []
	W1124 09:28:42.300850 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:42.300856 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:42.300921 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:42.327350 1707070 cri.go:89] found id: ""
	I1124 09:28:42.327368 1707070 logs.go:282] 0 containers: []
	W1124 09:28:42.327377 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:42.327382 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:42.327441 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:42.352768 1707070 cri.go:89] found id: ""
	I1124 09:28:42.352781 1707070 logs.go:282] 0 containers: []
	W1124 09:28:42.352788 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:42.352794 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:42.352858 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:42.384996 1707070 cri.go:89] found id: ""
	I1124 09:28:42.385016 1707070 logs.go:282] 0 containers: []
	W1124 09:28:42.385024 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:42.385035 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:42.385109 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:42.433916 1707070 cri.go:89] found id: ""
	I1124 09:28:42.433942 1707070 logs.go:282] 0 containers: []
	W1124 09:28:42.433963 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:42.433974 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:42.434041 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:42.469962 1707070 cri.go:89] found id: ""
	I1124 09:28:42.469976 1707070 logs.go:282] 0 containers: []
	W1124 09:28:42.469983 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:42.469989 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:42.470045 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:42.494905 1707070 cri.go:89] found id: ""
	I1124 09:28:42.494919 1707070 logs.go:282] 0 containers: []
	W1124 09:28:42.494926 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:42.494934 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:42.494944 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:42.551276 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:42.551295 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:42.568521 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:42.568538 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:42.631652 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:42.623578   15356 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:42.624203   15356 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:42.625718   15356 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:42.626134   15356 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:42.627653   15356 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1124 09:28:42.631662 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:42.631689 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:42.697554 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:42.697573 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:45.228012 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:45.242540 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:45.242663 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:45.285651 1707070 cri.go:89] found id: ""
	I1124 09:28:45.285666 1707070 logs.go:282] 0 containers: []
	W1124 09:28:45.285673 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:45.285679 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:45.285747 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:45.315729 1707070 cri.go:89] found id: ""
	I1124 09:28:45.315744 1707070 logs.go:282] 0 containers: []
	W1124 09:28:45.315759 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:45.315766 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:45.315838 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:45.342027 1707070 cri.go:89] found id: ""
	I1124 09:28:45.342041 1707070 logs.go:282] 0 containers: []
	W1124 09:28:45.342048 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:45.342053 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:45.342112 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:45.368019 1707070 cri.go:89] found id: ""
	I1124 09:28:45.368033 1707070 logs.go:282] 0 containers: []
	W1124 09:28:45.368040 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:45.368046 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:45.368102 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:45.406091 1707070 cri.go:89] found id: ""
	I1124 09:28:45.406104 1707070 logs.go:282] 0 containers: []
	W1124 09:28:45.406112 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:45.406119 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:45.406176 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:45.432356 1707070 cri.go:89] found id: ""
	I1124 09:28:45.432369 1707070 logs.go:282] 0 containers: []
	W1124 09:28:45.432377 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:45.432382 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:45.432449 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:45.465291 1707070 cri.go:89] found id: ""
	I1124 09:28:45.465315 1707070 logs.go:282] 0 containers: []
	W1124 09:28:45.465324 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:45.465332 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:45.465345 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:45.527756 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:45.527784 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:45.544616 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:45.544642 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:45.606842 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:45.598345   15463 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:45.599427   15463 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:45.600949   15463 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:45.601550   15463 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:45.603105   15463 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1124 09:28:45.606853 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:45.606866 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:45.669056 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:45.669077 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:48.198708 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:48.210384 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:48.210449 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:48.235268 1707070 cri.go:89] found id: ""
	I1124 09:28:48.235282 1707070 logs.go:282] 0 containers: []
	W1124 09:28:48.235289 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:48.235295 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:48.235357 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:48.261413 1707070 cri.go:89] found id: ""
	I1124 09:28:48.261427 1707070 logs.go:282] 0 containers: []
	W1124 09:28:48.261434 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:48.261439 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:48.261496 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:48.291100 1707070 cri.go:89] found id: ""
	I1124 09:28:48.291114 1707070 logs.go:282] 0 containers: []
	W1124 09:28:48.291122 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:48.291127 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:48.291186 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:48.326388 1707070 cri.go:89] found id: ""
	I1124 09:28:48.326412 1707070 logs.go:282] 0 containers: []
	W1124 09:28:48.326420 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:48.326426 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:48.326499 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:48.356212 1707070 cri.go:89] found id: ""
	I1124 09:28:48.356227 1707070 logs.go:282] 0 containers: []
	W1124 09:28:48.356234 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:48.356240 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:48.356299 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:48.384677 1707070 cri.go:89] found id: ""
	I1124 09:28:48.384690 1707070 logs.go:282] 0 containers: []
	W1124 09:28:48.384697 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:48.384703 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:48.384759 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:48.422001 1707070 cri.go:89] found id: ""
	I1124 09:28:48.422015 1707070 logs.go:282] 0 containers: []
	W1124 09:28:48.422022 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:48.422030 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:48.422040 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:48.492980 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:48.493001 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:48.522367 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:48.522383 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:48.577847 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:48.577866 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:48.594803 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:48.594821 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:48.662402 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:48.654176   15580 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:48.655485   15580 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:48.656131   15580 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:48.657084   15580 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:48.658755   15580 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:48.654176   15580 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:48.655485   15580 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:48.656131   15580 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:48.657084   15580 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:48.658755   15580 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:51.162680 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:51.173802 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:51.173865 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:51.200124 1707070 cri.go:89] found id: ""
	I1124 09:28:51.200146 1707070 logs.go:282] 0 containers: []
	W1124 09:28:51.200155 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:51.200161 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:51.200220 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:51.225309 1707070 cri.go:89] found id: ""
	I1124 09:28:51.225323 1707070 logs.go:282] 0 containers: []
	W1124 09:28:51.225330 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:51.225335 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:51.225392 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:51.249971 1707070 cri.go:89] found id: ""
	I1124 09:28:51.249985 1707070 logs.go:282] 0 containers: []
	W1124 09:28:51.249992 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:51.249997 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:51.250053 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:51.275848 1707070 cri.go:89] found id: ""
	I1124 09:28:51.275861 1707070 logs.go:282] 0 containers: []
	W1124 09:28:51.275868 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:51.275874 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:51.275929 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:51.304356 1707070 cri.go:89] found id: ""
	I1124 09:28:51.304370 1707070 logs.go:282] 0 containers: []
	W1124 09:28:51.304386 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:51.304392 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:51.304450 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:51.329000 1707070 cri.go:89] found id: ""
	I1124 09:28:51.329015 1707070 logs.go:282] 0 containers: []
	W1124 09:28:51.329021 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:51.329027 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:51.329099 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:51.357783 1707070 cri.go:89] found id: ""
	I1124 09:28:51.357796 1707070 logs.go:282] 0 containers: []
	W1124 09:28:51.357804 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:51.357811 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:51.357820 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:51.426561 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:51.426582 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:51.456185 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:51.456202 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:51.512504 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:51.512525 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:51.530860 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:51.530877 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:51.596556 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:51.586703   15685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:51.587508   15685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:51.589233   15685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:51.589675   15685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:51.591800   15685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:51.586703   15685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:51.587508   15685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:51.589233   15685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:51.589675   15685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:51.591800   15685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:54.097448 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:54.107646 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:54.107710 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:54.131850 1707070 cri.go:89] found id: ""
	I1124 09:28:54.131869 1707070 logs.go:282] 0 containers: []
	W1124 09:28:54.131877 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:54.131883 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:54.131950 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:54.157778 1707070 cri.go:89] found id: ""
	I1124 09:28:54.157793 1707070 logs.go:282] 0 containers: []
	W1124 09:28:54.157800 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:54.157806 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:54.157871 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:54.183638 1707070 cri.go:89] found id: ""
	I1124 09:28:54.183661 1707070 logs.go:282] 0 containers: []
	W1124 09:28:54.183668 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:54.183676 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:54.183745 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:54.208654 1707070 cri.go:89] found id: ""
	I1124 09:28:54.208668 1707070 logs.go:282] 0 containers: []
	W1124 09:28:54.208675 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:54.208680 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:54.208741 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:54.237302 1707070 cri.go:89] found id: ""
	I1124 09:28:54.237317 1707070 logs.go:282] 0 containers: []
	W1124 09:28:54.237325 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:54.237331 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:54.237390 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:54.261089 1707070 cri.go:89] found id: ""
	I1124 09:28:54.261111 1707070 logs.go:282] 0 containers: []
	W1124 09:28:54.261119 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:54.261124 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:54.261195 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:54.289315 1707070 cri.go:89] found id: ""
	I1124 09:28:54.289337 1707070 logs.go:282] 0 containers: []
	W1124 09:28:54.289345 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:54.289353 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:54.289363 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:54.350840 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:54.350861 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:54.391880 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:54.391897 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:54.457044 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:54.457066 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:54.475507 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:54.475525 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:54.538358 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:54.529952   15794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:54.530805   15794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:54.531583   15794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:54.533115   15794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:54.533777   15794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:54.529952   15794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:54.530805   15794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:54.531583   15794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:54.533115   15794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:54.533777   15794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:57.040068 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:57.050642 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:57.050707 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:57.075811 1707070 cri.go:89] found id: ""
	I1124 09:28:57.075824 1707070 logs.go:282] 0 containers: []
	W1124 09:28:57.075832 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:57.075837 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:57.075899 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:57.106029 1707070 cri.go:89] found id: ""
	I1124 09:28:57.106044 1707070 logs.go:282] 0 containers: []
	W1124 09:28:57.106052 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:57.106058 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:57.106114 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:57.132742 1707070 cri.go:89] found id: ""
	I1124 09:28:57.132756 1707070 logs.go:282] 0 containers: []
	W1124 09:28:57.132763 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:57.132768 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:57.132825 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:57.156809 1707070 cri.go:89] found id: ""
	I1124 09:28:57.156823 1707070 logs.go:282] 0 containers: []
	W1124 09:28:57.156830 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:57.156835 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:57.156898 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:57.182649 1707070 cri.go:89] found id: ""
	I1124 09:28:57.182663 1707070 logs.go:282] 0 containers: []
	W1124 09:28:57.182670 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:57.182676 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:57.182733 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:57.206184 1707070 cri.go:89] found id: ""
	I1124 09:28:57.206198 1707070 logs.go:282] 0 containers: []
	W1124 09:28:57.206205 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:57.206211 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:57.206275 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:57.230629 1707070 cri.go:89] found id: ""
	I1124 09:28:57.230643 1707070 logs.go:282] 0 containers: []
	W1124 09:28:57.230651 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:57.230660 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:57.230670 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:57.287168 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:57.287187 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:57.304021 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:57.304037 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:57.368613 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:57.361126   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:57.361623   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:57.363259   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:57.363659   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:57.365140   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:57.361126   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:57.361623   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:57.363259   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:57.363659   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:57.365140   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:57.368624 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:57.368635 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:57.439834 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:57.439854 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:59.971306 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:59.982006 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:59.982066 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:00.016934 1707070 cri.go:89] found id: ""
	I1124 09:29:00.016951 1707070 logs.go:282] 0 containers: []
	W1124 09:29:00.016966 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:00.016973 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:00.017049 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:00.103638 1707070 cri.go:89] found id: ""
	I1124 09:29:00.103654 1707070 logs.go:282] 0 containers: []
	W1124 09:29:00.103663 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:00.103669 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:00.103740 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:00.170246 1707070 cri.go:89] found id: ""
	I1124 09:29:00.170264 1707070 logs.go:282] 0 containers: []
	W1124 09:29:00.170273 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:00.170280 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:00.170350 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:00.236365 1707070 cri.go:89] found id: ""
	I1124 09:29:00.236382 1707070 logs.go:282] 0 containers: []
	W1124 09:29:00.236390 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:00.236397 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:00.236474 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:00.304007 1707070 cri.go:89] found id: ""
	I1124 09:29:00.304026 1707070 logs.go:282] 0 containers: []
	W1124 09:29:00.304036 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:00.304048 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:00.304139 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:00.347892 1707070 cri.go:89] found id: ""
	I1124 09:29:00.347907 1707070 logs.go:282] 0 containers: []
	W1124 09:29:00.347916 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:00.347924 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:00.348047 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:00.392276 1707070 cri.go:89] found id: ""
	I1124 09:29:00.392292 1707070 logs.go:282] 0 containers: []
	W1124 09:29:00.392304 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:00.392314 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:00.392328 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:00.445097 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:00.445118 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:00.507903 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:00.507923 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:00.532762 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:00.532787 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:00.603329 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:00.595058   16004 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:00.595595   16004 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:00.597748   16004 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:00.598425   16004 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:00.599635   16004 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:00.595058   16004 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:00.595595   16004 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:00.597748   16004 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:00.598425   16004 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:00.599635   16004 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:00.603341 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:00.603352 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:03.164630 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:03.174868 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:03.174928 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:03.198952 1707070 cri.go:89] found id: ""
	I1124 09:29:03.198966 1707070 logs.go:282] 0 containers: []
	W1124 09:29:03.198973 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:03.198979 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:03.199038 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:03.228049 1707070 cri.go:89] found id: ""
	I1124 09:29:03.228063 1707070 logs.go:282] 0 containers: []
	W1124 09:29:03.228070 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:03.228075 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:03.228133 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:03.253873 1707070 cri.go:89] found id: ""
	I1124 09:29:03.253888 1707070 logs.go:282] 0 containers: []
	W1124 09:29:03.253895 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:03.253901 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:03.253969 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:03.277874 1707070 cri.go:89] found id: ""
	I1124 09:29:03.277889 1707070 logs.go:282] 0 containers: []
	W1124 09:29:03.277903 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:03.277909 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:03.277966 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:03.306311 1707070 cri.go:89] found id: ""
	I1124 09:29:03.306333 1707070 logs.go:282] 0 containers: []
	W1124 09:29:03.306340 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:03.306345 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:03.306402 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:03.330412 1707070 cri.go:89] found id: ""
	I1124 09:29:03.330425 1707070 logs.go:282] 0 containers: []
	W1124 09:29:03.330432 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:03.330438 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:03.330572 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:03.359087 1707070 cri.go:89] found id: ""
	I1124 09:29:03.359101 1707070 logs.go:282] 0 containers: []
	W1124 09:29:03.359108 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:03.359116 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:03.359125 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:03.430996 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:03.431015 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:03.467444 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:03.467460 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:03.526316 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:03.526336 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:03.543233 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:03.543250 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:03.605146 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:03.596435   16110 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:03.597161   16110 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:03.598917   16110 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:03.599598   16110 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:03.601425   16110 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:03.596435   16110 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:03.597161   16110 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:03.598917   16110 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:03.599598   16110 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:03.601425   16110 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:06.105406 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:06.116034 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:06.116093 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:06.140111 1707070 cri.go:89] found id: ""
	I1124 09:29:06.140125 1707070 logs.go:282] 0 containers: []
	W1124 09:29:06.140132 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:06.140137 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:06.140195 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:06.164893 1707070 cri.go:89] found id: ""
	I1124 09:29:06.164907 1707070 logs.go:282] 0 containers: []
	W1124 09:29:06.164914 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:06.164920 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:06.164979 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:06.190122 1707070 cri.go:89] found id: ""
	I1124 09:29:06.190137 1707070 logs.go:282] 0 containers: []
	W1124 09:29:06.190144 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:06.190149 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:06.190206 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:06.215548 1707070 cri.go:89] found id: ""
	I1124 09:29:06.215562 1707070 logs.go:282] 0 containers: []
	W1124 09:29:06.215569 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:06.215575 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:06.215630 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:06.239566 1707070 cri.go:89] found id: ""
	I1124 09:29:06.239592 1707070 logs.go:282] 0 containers: []
	W1124 09:29:06.239600 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:06.239605 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:06.239662 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:06.266190 1707070 cri.go:89] found id: ""
	I1124 09:29:06.266223 1707070 logs.go:282] 0 containers: []
	W1124 09:29:06.266232 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:06.266237 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:06.266301 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:06.289910 1707070 cri.go:89] found id: ""
	I1124 09:29:06.289923 1707070 logs.go:282] 0 containers: []
	W1124 09:29:06.289930 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:06.289939 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:06.289955 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:06.353044 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:06.345412   16195 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:06.345855   16195 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:06.347499   16195 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:06.347885   16195 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:06.349511   16195 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:06.345412   16195 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:06.345855   16195 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:06.347499   16195 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:06.347885   16195 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:06.349511   16195 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:06.353054 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:06.353068 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:06.420094 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:06.420114 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:06.452708 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:06.452724 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:06.508689 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:06.508708 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:09.026433 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:09.036862 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:09.036926 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:09.061951 1707070 cri.go:89] found id: ""
	I1124 09:29:09.061965 1707070 logs.go:282] 0 containers: []
	W1124 09:29:09.061972 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:09.061977 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:09.062035 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:09.087954 1707070 cri.go:89] found id: ""
	I1124 09:29:09.087968 1707070 logs.go:282] 0 containers: []
	W1124 09:29:09.087976 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:09.087981 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:09.088044 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:09.112784 1707070 cri.go:89] found id: ""
	I1124 09:29:09.112798 1707070 logs.go:282] 0 containers: []
	W1124 09:29:09.112805 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:09.112810 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:09.112869 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:09.137324 1707070 cri.go:89] found id: ""
	I1124 09:29:09.137339 1707070 logs.go:282] 0 containers: []
	W1124 09:29:09.137347 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:09.137353 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:09.137413 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:09.162408 1707070 cri.go:89] found id: ""
	I1124 09:29:09.162422 1707070 logs.go:282] 0 containers: []
	W1124 09:29:09.162430 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:09.162435 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:09.162513 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:09.191279 1707070 cri.go:89] found id: ""
	I1124 09:29:09.191293 1707070 logs.go:282] 0 containers: []
	W1124 09:29:09.191300 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:09.191305 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:09.191361 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:09.214616 1707070 cri.go:89] found id: ""
	I1124 09:29:09.214630 1707070 logs.go:282] 0 containers: []
	W1124 09:29:09.214637 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:09.214645 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:09.214657 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:09.270146 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:09.270164 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:09.287320 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:09.287340 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:09.352488 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:09.344015   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:09.344642   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:09.346617   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:09.347280   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:09.348952   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:09.344015   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:09.344642   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:09.346617   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:09.347280   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:09.348952   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:09.352499 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:09.352510 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:09.418511 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:09.418532 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:11.954969 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:11.967024 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:11.967089 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:11.990717 1707070 cri.go:89] found id: ""
	I1124 09:29:11.990733 1707070 logs.go:282] 0 containers: []
	W1124 09:29:11.990741 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:11.990746 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:11.990809 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:12.020399 1707070 cri.go:89] found id: ""
	I1124 09:29:12.020413 1707070 logs.go:282] 0 containers: []
	W1124 09:29:12.020421 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:12.020427 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:12.020495 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:12.047081 1707070 cri.go:89] found id: ""
	I1124 09:29:12.047105 1707070 logs.go:282] 0 containers: []
	W1124 09:29:12.047114 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:12.047120 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:12.047185 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:12.072046 1707070 cri.go:89] found id: ""
	I1124 09:29:12.072060 1707070 logs.go:282] 0 containers: []
	W1124 09:29:12.072068 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:12.072074 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:12.072131 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:12.103533 1707070 cri.go:89] found id: ""
	I1124 09:29:12.103547 1707070 logs.go:282] 0 containers: []
	W1124 09:29:12.103554 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:12.103559 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:12.103619 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:12.131885 1707070 cri.go:89] found id: ""
	I1124 09:29:12.131900 1707070 logs.go:282] 0 containers: []
	W1124 09:29:12.131908 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:12.131914 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:12.131977 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:12.156166 1707070 cri.go:89] found id: ""
	I1124 09:29:12.156180 1707070 logs.go:282] 0 containers: []
	W1124 09:29:12.156187 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:12.156195 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:12.156206 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:12.184115 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:12.184131 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:12.239534 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:12.239553 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:12.256920 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:12.256937 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:12.322513 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:12.315053   16419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:12.315552   16419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:12.317173   16419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:12.317659   16419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:12.319113   16419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:12.315053   16419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:12.315552   16419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:12.317173   16419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:12.317659   16419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:12.319113   16419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:12.322536 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:12.322546 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:14.891198 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:14.901386 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:14.901446 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:14.926318 1707070 cri.go:89] found id: ""
	I1124 09:29:14.926340 1707070 logs.go:282] 0 containers: []
	W1124 09:29:14.926347 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:14.926353 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:14.926413 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:14.955083 1707070 cri.go:89] found id: ""
	I1124 09:29:14.955097 1707070 logs.go:282] 0 containers: []
	W1124 09:29:14.955104 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:14.955110 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:14.955167 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:14.979745 1707070 cri.go:89] found id: ""
	I1124 09:29:14.979758 1707070 logs.go:282] 0 containers: []
	W1124 09:29:14.979766 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:14.979771 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:14.979829 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:15.004845 1707070 cri.go:89] found id: ""
	I1124 09:29:15.004861 1707070 logs.go:282] 0 containers: []
	W1124 09:29:15.004869 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:15.004875 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:15.004952 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:15.044211 1707070 cri.go:89] found id: ""
	I1124 09:29:15.044225 1707070 logs.go:282] 0 containers: []
	W1124 09:29:15.044237 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:15.044243 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:15.044330 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:15.075656 1707070 cri.go:89] found id: ""
	I1124 09:29:15.075669 1707070 logs.go:282] 0 containers: []
	W1124 09:29:15.075677 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:15.075682 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:15.075740 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:15.101378 1707070 cri.go:89] found id: ""
	I1124 09:29:15.101392 1707070 logs.go:282] 0 containers: []
	W1124 09:29:15.101400 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:15.101408 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:15.101418 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:15.159297 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:15.159316 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:15.176523 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:15.176541 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:15.242899 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:15.234359   16513 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:15.235294   16513 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:15.237104   16513 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:15.237675   16513 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:15.239169   16513 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1124 09:29:15.242909 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:15.242919 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:15.304297 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:15.304319 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:17.833530 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:17.843418 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:17.843476 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:17.867779 1707070 cri.go:89] found id: ""
	I1124 09:29:17.867793 1707070 logs.go:282] 0 containers: []
	W1124 09:29:17.867806 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:17.867811 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:17.867866 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:17.891077 1707070 cri.go:89] found id: ""
	I1124 09:29:17.891090 1707070 logs.go:282] 0 containers: []
	W1124 09:29:17.891098 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:17.891103 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:17.891187 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:17.915275 1707070 cri.go:89] found id: ""
	I1124 09:29:17.915289 1707070 logs.go:282] 0 containers: []
	W1124 09:29:17.915296 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:17.915301 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:17.915357 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:17.943098 1707070 cri.go:89] found id: ""
	I1124 09:29:17.943111 1707070 logs.go:282] 0 containers: []
	W1124 09:29:17.943119 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:17.943124 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:17.943186 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:17.968417 1707070 cri.go:89] found id: ""
	I1124 09:29:17.968430 1707070 logs.go:282] 0 containers: []
	W1124 09:29:17.968437 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:17.968443 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:17.968501 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:17.993301 1707070 cri.go:89] found id: ""
	I1124 09:29:17.993315 1707070 logs.go:282] 0 containers: []
	W1124 09:29:17.993322 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:17.993328 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:17.993385 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:18.021715 1707070 cri.go:89] found id: ""
	I1124 09:29:18.021730 1707070 logs.go:282] 0 containers: []
	W1124 09:29:18.021738 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:18.021746 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:18.021756 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:18.085324 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:18.085345 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:18.118128 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:18.118159 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:18.182148 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:18.182171 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:18.199970 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:18.199990 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:18.266928 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:18.258137   16633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:18.258818   16633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:18.260418   16633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:18.261036   16633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:18.262678   16633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1124 09:29:20.768145 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:20.780890 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:20.780956 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:20.807227 1707070 cri.go:89] found id: ""
	I1124 09:29:20.807241 1707070 logs.go:282] 0 containers: []
	W1124 09:29:20.807248 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:20.807253 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:20.807317 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:20.836452 1707070 cri.go:89] found id: ""
	I1124 09:29:20.836466 1707070 logs.go:282] 0 containers: []
	W1124 09:29:20.836473 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:20.836478 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:20.836535 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:20.861534 1707070 cri.go:89] found id: ""
	I1124 09:29:20.861549 1707070 logs.go:282] 0 containers: []
	W1124 09:29:20.861556 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:20.861561 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:20.861620 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:20.890181 1707070 cri.go:89] found id: ""
	I1124 09:29:20.890196 1707070 logs.go:282] 0 containers: []
	W1124 09:29:20.890203 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:20.890209 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:20.890278 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:20.919882 1707070 cri.go:89] found id: ""
	I1124 09:29:20.919897 1707070 logs.go:282] 0 containers: []
	W1124 09:29:20.919904 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:20.919910 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:20.919973 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:20.948347 1707070 cri.go:89] found id: ""
	I1124 09:29:20.948361 1707070 logs.go:282] 0 containers: []
	W1124 09:29:20.948368 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:20.948373 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:20.948428 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:20.972834 1707070 cri.go:89] found id: ""
	I1124 09:29:20.972847 1707070 logs.go:282] 0 containers: []
	W1124 09:29:20.972855 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:20.972862 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:20.972873 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:21.029330 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:21.029350 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:21.046983 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:21.047000 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:21.112004 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:21.104171   16728 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:21.104918   16728 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:21.106573   16728 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:21.107127   16728 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:21.108653   16728 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1124 09:29:21.112015 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:21.112025 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:21.174850 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:21.174870 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:23.702609 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:23.712856 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:23.712939 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:23.741964 1707070 cri.go:89] found id: ""
	I1124 09:29:23.741978 1707070 logs.go:282] 0 containers: []
	W1124 09:29:23.741985 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:23.741991 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:23.742067 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:23.766952 1707070 cri.go:89] found id: ""
	I1124 09:29:23.766966 1707070 logs.go:282] 0 containers: []
	W1124 09:29:23.766972 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:23.766978 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:23.767035 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:23.790992 1707070 cri.go:89] found id: ""
	I1124 09:29:23.791005 1707070 logs.go:282] 0 containers: []
	W1124 09:29:23.791013 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:23.791018 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:23.791073 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:23.819700 1707070 cri.go:89] found id: ""
	I1124 09:29:23.819713 1707070 logs.go:282] 0 containers: []
	W1124 09:29:23.819720 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:23.819726 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:23.819786 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:23.848657 1707070 cri.go:89] found id: ""
	I1124 09:29:23.848683 1707070 logs.go:282] 0 containers: []
	W1124 09:29:23.848690 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:23.848695 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:23.848754 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:23.873546 1707070 cri.go:89] found id: ""
	I1124 09:29:23.873571 1707070 logs.go:282] 0 containers: []
	W1124 09:29:23.873578 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:23.873584 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:23.873654 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:23.899519 1707070 cri.go:89] found id: ""
	I1124 09:29:23.899533 1707070 logs.go:282] 0 containers: []
	W1124 09:29:23.899547 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:23.899556 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:23.899568 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:23.954834 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:23.954854 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:23.971662 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:23.971680 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:24.041660 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:24.033560   16835 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:24.034352   16835 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:24.036062   16835 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:24.036417   16835 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:24.038032   16835 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1124 09:29:24.041670 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:24.041681 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:24.105146 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:24.105168 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:26.634760 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:26.646166 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:26.646251 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:26.679257 1707070 cri.go:89] found id: ""
	I1124 09:29:26.679271 1707070 logs.go:282] 0 containers: []
	W1124 09:29:26.679279 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:26.679284 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:26.679344 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:26.710754 1707070 cri.go:89] found id: ""
	I1124 09:29:26.710768 1707070 logs.go:282] 0 containers: []
	W1124 09:29:26.710775 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:26.710782 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:26.710840 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:26.735831 1707070 cri.go:89] found id: ""
	I1124 09:29:26.735845 1707070 logs.go:282] 0 containers: []
	W1124 09:29:26.735852 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:26.735857 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:26.735926 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:26.759918 1707070 cri.go:89] found id: ""
	I1124 09:29:26.759932 1707070 logs.go:282] 0 containers: []
	W1124 09:29:26.759939 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:26.759947 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:26.760002 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:26.783806 1707070 cri.go:89] found id: ""
	I1124 09:29:26.783825 1707070 logs.go:282] 0 containers: []
	W1124 09:29:26.783832 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:26.783838 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:26.783895 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:26.809230 1707070 cri.go:89] found id: ""
	I1124 09:29:26.809244 1707070 logs.go:282] 0 containers: []
	W1124 09:29:26.809252 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:26.809266 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:26.809331 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:26.836902 1707070 cri.go:89] found id: ""
	I1124 09:29:26.836916 1707070 logs.go:282] 0 containers: []
	W1124 09:29:26.836923 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:26.836931 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:26.836942 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:26.853955 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:26.853978 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:26.916186 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:26.907929   16937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:26.908672   16937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:26.910345   16937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:26.910937   16937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:26.912681   16937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1124 09:29:26.916196 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:26.916218 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:26.980050 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:26.980072 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:27.010821 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:27.010838 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:29.573482 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:29.583518 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:29.583582 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:29.608188 1707070 cri.go:89] found id: ""
	I1124 09:29:29.608202 1707070 logs.go:282] 0 containers: []
	W1124 09:29:29.608209 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:29.608214 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:29.608270 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:29.641187 1707070 cri.go:89] found id: ""
	I1124 09:29:29.641201 1707070 logs.go:282] 0 containers: []
	W1124 09:29:29.641209 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:29.641214 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:29.641282 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:29.674249 1707070 cri.go:89] found id: ""
	I1124 09:29:29.674269 1707070 logs.go:282] 0 containers: []
	W1124 09:29:29.674276 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:29.674282 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:29.674339 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:29.700355 1707070 cri.go:89] found id: ""
	I1124 09:29:29.700370 1707070 logs.go:282] 0 containers: []
	W1124 09:29:29.700377 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:29.700382 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:29.700438 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:29.729232 1707070 cri.go:89] found id: ""
	I1124 09:29:29.729246 1707070 logs.go:282] 0 containers: []
	W1124 09:29:29.729253 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:29.729257 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:29.729313 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:29.756753 1707070 cri.go:89] found id: ""
	I1124 09:29:29.756766 1707070 logs.go:282] 0 containers: []
	W1124 09:29:29.756773 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:29.756788 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:29.756849 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:29.782318 1707070 cri.go:89] found id: ""
	I1124 09:29:29.782332 1707070 logs.go:282] 0 containers: []
	W1124 09:29:29.782339 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:29.782347 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:29.782358 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:29.837944 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:29.837963 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:29.855075 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:29.855094 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:29.916212 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:29.907972   17045 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:29.908745   17045 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:29.910447   17045 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:29.910972   17045 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:29.912670   17045 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:29.907972   17045 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:29.908745   17045 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:29.910447   17045 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:29.910972   17045 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:29.912670   17045 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:29.916221 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:29.916232 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:29.978681 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:29.978703 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:32.530833 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:32.541146 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:32.541251 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:32.566525 1707070 cri.go:89] found id: ""
	I1124 09:29:32.566540 1707070 logs.go:282] 0 containers: []
	W1124 09:29:32.566548 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:32.566554 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:32.566622 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:32.591741 1707070 cri.go:89] found id: ""
	I1124 09:29:32.591756 1707070 logs.go:282] 0 containers: []
	W1124 09:29:32.591763 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:32.591768 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:32.591826 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:32.617127 1707070 cri.go:89] found id: ""
	I1124 09:29:32.617141 1707070 logs.go:282] 0 containers: []
	W1124 09:29:32.617148 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:32.617153 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:32.617209 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:32.654493 1707070 cri.go:89] found id: ""
	I1124 09:29:32.654507 1707070 logs.go:282] 0 containers: []
	W1124 09:29:32.654515 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:32.654521 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:32.654580 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:32.685080 1707070 cri.go:89] found id: ""
	I1124 09:29:32.685094 1707070 logs.go:282] 0 containers: []
	W1124 09:29:32.685101 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:32.685106 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:32.685180 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:32.715751 1707070 cri.go:89] found id: ""
	I1124 09:29:32.715766 1707070 logs.go:282] 0 containers: []
	W1124 09:29:32.715782 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:32.715788 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:32.715850 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:32.742395 1707070 cri.go:89] found id: ""
	I1124 09:29:32.742409 1707070 logs.go:282] 0 containers: []
	W1124 09:29:32.742416 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:32.742424 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:32.742434 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:32.760261 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:32.760278 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:32.828736 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:32.819577   17149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:32.820328   17149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:32.822013   17149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:32.822622   17149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:32.824506   17149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:32.819577   17149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:32.820328   17149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:32.822013   17149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:32.822622   17149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:32.824506   17149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:32.828746 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:32.828759 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:32.896940 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:32.896965 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:32.928695 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:32.928711 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:35.485941 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:35.496873 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:35.496934 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:35.525748 1707070 cri.go:89] found id: ""
	I1124 09:29:35.525782 1707070 logs.go:282] 0 containers: []
	W1124 09:29:35.525791 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:35.525796 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:35.525866 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:35.553111 1707070 cri.go:89] found id: ""
	I1124 09:29:35.553126 1707070 logs.go:282] 0 containers: []
	W1124 09:29:35.553134 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:35.553142 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:35.553220 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:35.578594 1707070 cri.go:89] found id: ""
	I1124 09:29:35.578622 1707070 logs.go:282] 0 containers: []
	W1124 09:29:35.578629 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:35.578635 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:35.578706 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:35.607322 1707070 cri.go:89] found id: ""
	I1124 09:29:35.607336 1707070 logs.go:282] 0 containers: []
	W1124 09:29:35.607343 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:35.607348 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:35.607417 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:35.638865 1707070 cri.go:89] found id: ""
	I1124 09:29:35.638880 1707070 logs.go:282] 0 containers: []
	W1124 09:29:35.638887 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:35.638893 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:35.638960 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:35.672327 1707070 cri.go:89] found id: ""
	I1124 09:29:35.672352 1707070 logs.go:282] 0 containers: []
	W1124 09:29:35.672360 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:35.672365 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:35.672431 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:35.700255 1707070 cri.go:89] found id: ""
	I1124 09:29:35.700269 1707070 logs.go:282] 0 containers: []
	W1124 09:29:35.700277 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:35.700285 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:35.700297 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:35.758017 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:35.758037 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:35.775326 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:35.775344 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:35.842090 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:35.833688   17255 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:35.834400   17255 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:35.836148   17255 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:35.836802   17255 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:35.838521   17255 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:35.833688   17255 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:35.834400   17255 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:35.836148   17255 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:35.836802   17255 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:35.838521   17255 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:35.842100 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:35.842120 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:35.908742 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:35.908769 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:38.443689 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:38.453968 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:38.454035 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:38.477762 1707070 cri.go:89] found id: ""
	I1124 09:29:38.477776 1707070 logs.go:282] 0 containers: []
	W1124 09:29:38.477783 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:38.477789 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:38.477853 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:38.506120 1707070 cri.go:89] found id: ""
	I1124 09:29:38.506134 1707070 logs.go:282] 0 containers: []
	W1124 09:29:38.506141 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:38.506147 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:38.506203 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:38.530669 1707070 cri.go:89] found id: ""
	I1124 09:29:38.530691 1707070 logs.go:282] 0 containers: []
	W1124 09:29:38.530699 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:38.530705 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:38.530763 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:38.560535 1707070 cri.go:89] found id: ""
	I1124 09:29:38.560558 1707070 logs.go:282] 0 containers: []
	W1124 09:29:38.560565 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:38.560572 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:38.560631 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:38.586535 1707070 cri.go:89] found id: ""
	I1124 09:29:38.586549 1707070 logs.go:282] 0 containers: []
	W1124 09:29:38.586556 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:38.586561 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:38.586620 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:38.611101 1707070 cri.go:89] found id: ""
	I1124 09:29:38.611115 1707070 logs.go:282] 0 containers: []
	W1124 09:29:38.611122 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:38.611127 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:38.611186 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:38.643467 1707070 cri.go:89] found id: ""
	I1124 09:29:38.643482 1707070 logs.go:282] 0 containers: []
	W1124 09:29:38.643489 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:38.643497 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:38.643508 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:38.708197 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:38.708218 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:38.725978 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:38.725995 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:38.789806 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:38.781672   17359 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:38.782397   17359 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:38.783993   17359 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:38.784577   17359 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:38.786135   17359 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:38.781672   17359 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:38.782397   17359 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:38.783993   17359 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:38.784577   17359 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:38.786135   17359 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:38.789818 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:38.789828 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:38.853085 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:38.853106 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
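The repeated cycles above are minikube waiting for the apiserver process to appear: it re-runs `sudo pgrep -xnf kube-apiserver.*minikube.*` roughly every 3 seconds, and on each failure checks `crictl ps` for every control-plane container and gathers logs. A minimal sketch of that retry-until-deadline pattern (function name and parameters are illustrative, not minikube's actual code):

```python
import subprocess
import time

def wait_for(cmd, timeout=30.0, interval=3.0):
    """Re-run `cmd` until it exits 0 or `timeout` seconds elapse.

    Mirrors the polling in the log above, where minikube retries
    `pgrep -xnf kube-apiserver.*minikube.*` about every 3 seconds
    and falls through to log gathering on each miss.
    """
    deadline = time.monotonic() + timeout
    while True:
        # Exit status 0 means the process (or container) was found.
        if subprocess.run(cmd, capture_output=True).returncode == 0:
            return True
        if time.monotonic() >= deadline:
            return False
        time.sleep(interval)
```

In the run above the apiserver never comes up, so every cycle ends the same way: zero containers found and `kubectl describe nodes` refused on localhost:8441.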
	I1124 09:29:41.387044 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:41.398117 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:41.398183 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:41.424537 1707070 cri.go:89] found id: ""
	I1124 09:29:41.424551 1707070 logs.go:282] 0 containers: []
	W1124 09:29:41.424558 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:41.424564 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:41.424626 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:41.454716 1707070 cri.go:89] found id: ""
	I1124 09:29:41.454730 1707070 logs.go:282] 0 containers: []
	W1124 09:29:41.454737 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:41.454742 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:41.454801 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:41.479954 1707070 cri.go:89] found id: ""
	I1124 09:29:41.479969 1707070 logs.go:282] 0 containers: []
	W1124 09:29:41.479976 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:41.479981 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:41.480041 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:41.505560 1707070 cri.go:89] found id: ""
	I1124 09:29:41.505575 1707070 logs.go:282] 0 containers: []
	W1124 09:29:41.505582 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:41.505593 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:41.505654 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:41.530996 1707070 cri.go:89] found id: ""
	I1124 09:29:41.531010 1707070 logs.go:282] 0 containers: []
	W1124 09:29:41.531018 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:41.531024 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:41.531090 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:41.557489 1707070 cri.go:89] found id: ""
	I1124 09:29:41.557502 1707070 logs.go:282] 0 containers: []
	W1124 09:29:41.557510 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:41.557516 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:41.557575 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:41.587178 1707070 cri.go:89] found id: ""
	I1124 09:29:41.587192 1707070 logs.go:282] 0 containers: []
	W1124 09:29:41.587199 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:41.587207 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:41.587217 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:41.644853 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:41.644873 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:41.664905 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:41.664924 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:41.731530 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:41.723947   17464 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:41.724430   17464 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:41.726128   17464 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:41.726440   17464 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:41.727892   17464 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:41.723947   17464 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:41.724430   17464 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:41.726128   17464 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:41.726440   17464 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:41.727892   17464 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:41.731540 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:41.731550 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:41.793965 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:41.793985 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:44.323959 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:44.334291 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:44.334352 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:44.364183 1707070 cri.go:89] found id: ""
	I1124 09:29:44.364199 1707070 logs.go:282] 0 containers: []
	W1124 09:29:44.364206 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:44.364212 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:44.364285 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:44.391116 1707070 cri.go:89] found id: ""
	I1124 09:29:44.391130 1707070 logs.go:282] 0 containers: []
	W1124 09:29:44.391137 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:44.391142 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:44.391199 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:44.416448 1707070 cri.go:89] found id: ""
	I1124 09:29:44.416462 1707070 logs.go:282] 0 containers: []
	W1124 09:29:44.416470 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:44.416476 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:44.416533 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:44.442027 1707070 cri.go:89] found id: ""
	I1124 09:29:44.442042 1707070 logs.go:282] 0 containers: []
	W1124 09:29:44.442059 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:44.442065 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:44.442124 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:44.467492 1707070 cri.go:89] found id: ""
	I1124 09:29:44.467516 1707070 logs.go:282] 0 containers: []
	W1124 09:29:44.467525 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:44.467531 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:44.467643 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:44.492900 1707070 cri.go:89] found id: ""
	I1124 09:29:44.492914 1707070 logs.go:282] 0 containers: []
	W1124 09:29:44.492921 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:44.492927 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:44.492986 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:44.518419 1707070 cri.go:89] found id: ""
	I1124 09:29:44.518434 1707070 logs.go:282] 0 containers: []
	W1124 09:29:44.518441 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:44.518449 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:44.518479 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:44.584407 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:44.584427 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:44.616287 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:44.616305 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:44.680013 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:44.680033 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:44.702644 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:44.702662 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:44.770803 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:44.761924   17579 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:44.762682   17579 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:44.764417   17579 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:44.765036   17579 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:44.766673   17579 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:44.761924   17579 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:44.762682   17579 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:44.764417   17579 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:44.765036   17579 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:44.766673   17579 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:47.271699 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:47.283580 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:47.283646 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:47.309341 1707070 cri.go:89] found id: ""
	I1124 09:29:47.309355 1707070 logs.go:282] 0 containers: []
	W1124 09:29:47.309368 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:47.309385 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:47.309443 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:47.335187 1707070 cri.go:89] found id: ""
	I1124 09:29:47.335202 1707070 logs.go:282] 0 containers: []
	W1124 09:29:47.335209 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:47.335214 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:47.335273 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:47.362876 1707070 cri.go:89] found id: ""
	I1124 09:29:47.362891 1707070 logs.go:282] 0 containers: []
	W1124 09:29:47.362898 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:47.362904 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:47.362964 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:47.388290 1707070 cri.go:89] found id: ""
	I1124 09:29:47.388304 1707070 logs.go:282] 0 containers: []
	W1124 09:29:47.388311 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:47.388317 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:47.388374 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:47.416544 1707070 cri.go:89] found id: ""
	I1124 09:29:47.416558 1707070 logs.go:282] 0 containers: []
	W1124 09:29:47.416565 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:47.416570 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:47.416629 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:47.441861 1707070 cri.go:89] found id: ""
	I1124 09:29:47.441875 1707070 logs.go:282] 0 containers: []
	W1124 09:29:47.441902 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:47.441909 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:47.441978 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:47.465857 1707070 cri.go:89] found id: ""
	I1124 09:29:47.465879 1707070 logs.go:282] 0 containers: []
	W1124 09:29:47.465886 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:47.465894 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:47.465905 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:47.523429 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:47.523450 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:47.540445 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:47.540462 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:47.607683 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:47.599524   17667 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:47.600165   17667 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:47.601865   17667 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:47.602402   17667 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:47.603965   17667 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:47.599524   17667 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:47.600165   17667 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:47.601865   17667 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:47.602402   17667 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:47.603965   17667 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:47.607694 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:47.607704 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:47.682000 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:47.682023 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:50.218599 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:50.229182 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:50.229254 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:50.254129 1707070 cri.go:89] found id: ""
	I1124 09:29:50.254143 1707070 logs.go:282] 0 containers: []
	W1124 09:29:50.254150 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:50.254155 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:50.254219 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:50.280233 1707070 cri.go:89] found id: ""
	I1124 09:29:50.280247 1707070 logs.go:282] 0 containers: []
	W1124 09:29:50.280254 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:50.280260 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:50.280317 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:50.304403 1707070 cri.go:89] found id: ""
	I1124 09:29:50.304417 1707070 logs.go:282] 0 containers: []
	W1124 09:29:50.304424 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:50.304430 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:50.304492 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:50.329881 1707070 cri.go:89] found id: ""
	I1124 09:29:50.329897 1707070 logs.go:282] 0 containers: []
	W1124 09:29:50.329904 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:50.329910 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:50.329987 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:50.358124 1707070 cri.go:89] found id: ""
	I1124 09:29:50.358139 1707070 logs.go:282] 0 containers: []
	W1124 09:29:50.358149 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:50.358158 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:50.358246 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:50.384151 1707070 cri.go:89] found id: ""
	I1124 09:29:50.384165 1707070 logs.go:282] 0 containers: []
	W1124 09:29:50.384178 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:50.384196 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:50.384254 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:50.408884 1707070 cri.go:89] found id: ""
	I1124 09:29:50.408899 1707070 logs.go:282] 0 containers: []
	W1124 09:29:50.408906 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:50.408914 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:50.408925 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:50.464122 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:50.464147 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:50.480720 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:50.480736 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:50.544337 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:50.536334   17772 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:50.536956   17772 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:50.538555   17772 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:50.539042   17772 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:50.540634   17772 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:50.536334   17772 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:50.536956   17772 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:50.538555   17772 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:50.539042   17772 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:50.540634   17772 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:50.544348 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:50.544361 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:50.606972 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:50.606993 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:53.143446 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:53.154359 1707070 kubeadm.go:602] duration metric: took 4m4.065975367s to restartPrimaryControlPlane
	W1124 09:29:53.154423 1707070 out.go:285] ! Unable to restart control-plane node(s), will reset cluster: <no value>
	I1124 09:29:53.154529 1707070 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm reset --cri-socket /run/containerd/containerd.sock --force"
	I1124 09:29:53.563147 1707070 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1124 09:29:53.576942 1707070 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1124 09:29:53.584698 1707070 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1124 09:29:53.584758 1707070 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1124 09:29:53.592605 1707070 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1124 09:29:53.592613 1707070 kubeadm.go:158] found existing configuration files:
	
	I1124 09:29:53.592678 1707070 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1124 09:29:53.600460 1707070 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1124 09:29:53.600517 1707070 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1124 09:29:53.607615 1707070 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1124 09:29:53.615236 1707070 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1124 09:29:53.615293 1707070 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1124 09:29:53.622532 1707070 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1124 09:29:53.630501 1707070 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1124 09:29:53.630562 1707070 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1124 09:29:53.638386 1707070 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1124 09:29:53.646257 1707070 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1124 09:29:53.646321 1707070 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1124 09:29:53.653836 1707070 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1124 09:29:53.692708 1707070 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-beta.0
	I1124 09:29:53.692756 1707070 kubeadm.go:319] [preflight] Running pre-flight checks
	I1124 09:29:53.765347 1707070 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1124 09:29:53.765413 1707070 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1124 09:29:53.765447 1707070 kubeadm.go:319] OS: Linux
	I1124 09:29:53.765490 1707070 kubeadm.go:319] CGROUPS_CPU: enabled
	I1124 09:29:53.765537 1707070 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1124 09:29:53.765589 1707070 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1124 09:29:53.765636 1707070 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1124 09:29:53.765682 1707070 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1124 09:29:53.765729 1707070 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1124 09:29:53.765772 1707070 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1124 09:29:53.765819 1707070 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1124 09:29:53.765864 1707070 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1124 09:29:53.828877 1707070 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1124 09:29:53.829001 1707070 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1124 09:29:53.829104 1707070 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1124 09:29:53.834791 1707070 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1124 09:29:53.838245 1707070 out.go:252]   - Generating certificates and keys ...
	I1124 09:29:53.838369 1707070 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1124 09:29:53.838434 1707070 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1124 09:29:53.838527 1707070 kubeadm.go:319] [certs] Using existing apiserver-kubelet-client certificate and key on disk
	I1124 09:29:53.838616 1707070 kubeadm.go:319] [certs] Using existing front-proxy-ca certificate authority
	I1124 09:29:53.838701 1707070 kubeadm.go:319] [certs] Using existing front-proxy-client certificate and key on disk
	I1124 09:29:53.838784 1707070 kubeadm.go:319] [certs] Using existing etcd/ca certificate authority
	I1124 09:29:53.838854 1707070 kubeadm.go:319] [certs] Using existing etcd/server certificate and key on disk
	I1124 09:29:53.838919 1707070 kubeadm.go:319] [certs] Using existing etcd/peer certificate and key on disk
	I1124 09:29:53.839002 1707070 kubeadm.go:319] [certs] Using existing etcd/healthcheck-client certificate and key on disk
	I1124 09:29:53.839386 1707070 kubeadm.go:319] [certs] Using existing apiserver-etcd-client certificate and key on disk
	I1124 09:29:53.839639 1707070 kubeadm.go:319] [certs] Using the existing "sa" key
	I1124 09:29:53.839706 1707070 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1124 09:29:54.545063 1707070 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1124 09:29:55.036514 1707070 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1124 09:29:55.148786 1707070 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1124 09:29:55.311399 1707070 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1124 09:29:55.656188 1707070 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1124 09:29:55.656996 1707070 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1124 09:29:55.659590 1707070 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1124 09:29:55.662658 1707070 out.go:252]   - Booting up control plane ...
	I1124 09:29:55.662786 1707070 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1124 09:29:55.662870 1707070 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1124 09:29:55.664747 1707070 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1124 09:29:55.686536 1707070 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1124 09:29:55.686657 1707070 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1124 09:29:55.694440 1707070 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1124 09:29:55.694885 1707070 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1124 09:29:55.694934 1707070 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1124 09:29:55.830944 1707070 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1124 09:29:55.831051 1707070 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1124 09:33:55.829210 1707070 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.000251849s
	I1124 09:33:55.829235 1707070 kubeadm.go:319] 
	I1124 09:33:55.829291 1707070 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1124 09:33:55.829323 1707070 kubeadm.go:319] 	- The kubelet is not running
	I1124 09:33:55.829428 1707070 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1124 09:33:55.829432 1707070 kubeadm.go:319] 
	I1124 09:33:55.829536 1707070 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1124 09:33:55.829573 1707070 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1124 09:33:55.829603 1707070 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1124 09:33:55.829606 1707070 kubeadm.go:319] 
	I1124 09:33:55.833661 1707070 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1124 09:33:55.834099 1707070 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1124 09:33:55.834220 1707070 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1124 09:33:55.834508 1707070 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	I1124 09:33:55.834517 1707070 kubeadm.go:319] 
	I1124 09:33:55.834670 1707070 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	W1124 09:33:55.834735 1707070 out.go:285] ! initialization failed, will try again: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000251849s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	I1124 09:33:55.834825 1707070 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm reset --cri-socket /run/containerd/containerd.sock --force"
	I1124 09:33:56.243415 1707070 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1124 09:33:56.256462 1707070 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1124 09:33:56.256517 1707070 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1124 09:33:56.264387 1707070 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1124 09:33:56.264397 1707070 kubeadm.go:158] found existing configuration files:
	
	I1124 09:33:56.264448 1707070 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1124 09:33:56.272152 1707070 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1124 09:33:56.272210 1707070 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1124 09:33:56.279938 1707070 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1124 09:33:56.287667 1707070 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1124 09:33:56.287720 1707070 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1124 09:33:56.295096 1707070 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1124 09:33:56.302699 1707070 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1124 09:33:56.302758 1707070 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1124 09:33:56.310421 1707070 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1124 09:33:56.318128 1707070 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1124 09:33:56.318183 1707070 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1124 09:33:56.325438 1707070 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1124 09:33:56.364513 1707070 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-beta.0
	I1124 09:33:56.364563 1707070 kubeadm.go:319] [preflight] Running pre-flight checks
	I1124 09:33:56.440273 1707070 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1124 09:33:56.440340 1707070 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1124 09:33:56.440376 1707070 kubeadm.go:319] OS: Linux
	I1124 09:33:56.440420 1707070 kubeadm.go:319] CGROUPS_CPU: enabled
	I1124 09:33:56.440467 1707070 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1124 09:33:56.440513 1707070 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1124 09:33:56.440560 1707070 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1124 09:33:56.440606 1707070 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1124 09:33:56.440654 1707070 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1124 09:33:56.440697 1707070 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1124 09:33:56.440749 1707070 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1124 09:33:56.440794 1707070 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1124 09:33:56.504487 1707070 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1124 09:33:56.504590 1707070 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1124 09:33:56.504685 1707070 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1124 09:33:56.510220 1707070 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1124 09:33:56.513847 1707070 out.go:252]   - Generating certificates and keys ...
	I1124 09:33:56.513936 1707070 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1124 09:33:56.514003 1707070 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1124 09:33:56.514078 1707070 kubeadm.go:319] [certs] Using existing apiserver-kubelet-client certificate and key on disk
	I1124 09:33:56.514137 1707070 kubeadm.go:319] [certs] Using existing front-proxy-ca certificate authority
	I1124 09:33:56.514205 1707070 kubeadm.go:319] [certs] Using existing front-proxy-client certificate and key on disk
	I1124 09:33:56.514264 1707070 kubeadm.go:319] [certs] Using existing etcd/ca certificate authority
	I1124 09:33:56.514326 1707070 kubeadm.go:319] [certs] Using existing etcd/server certificate and key on disk
	I1124 09:33:56.514386 1707070 kubeadm.go:319] [certs] Using existing etcd/peer certificate and key on disk
	I1124 09:33:56.514481 1707070 kubeadm.go:319] [certs] Using existing etcd/healthcheck-client certificate and key on disk
	I1124 09:33:56.514553 1707070 kubeadm.go:319] [certs] Using existing apiserver-etcd-client certificate and key on disk
	I1124 09:33:56.514589 1707070 kubeadm.go:319] [certs] Using the existing "sa" key
	I1124 09:33:56.514644 1707070 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1124 09:33:57.046366 1707070 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1124 09:33:57.432965 1707070 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1124 09:33:57.802873 1707070 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1124 09:33:58.414576 1707070 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1124 09:33:58.520825 1707070 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1124 09:33:58.522049 1707070 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1124 09:33:58.526436 1707070 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1124 09:33:58.529676 1707070 out.go:252]   - Booting up control plane ...
	I1124 09:33:58.529779 1707070 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1124 09:33:58.529855 1707070 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1124 09:33:58.529921 1707070 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1124 09:33:58.549683 1707070 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1124 09:33:58.549801 1707070 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1124 09:33:58.557327 1707070 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1124 09:33:58.557589 1707070 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1124 09:33:58.557812 1707070 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1124 09:33:58.696439 1707070 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1124 09:33:58.696553 1707070 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1124 09:37:58.697446 1707070 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.001230859s
	I1124 09:37:58.697472 1707070 kubeadm.go:319] 
	I1124 09:37:58.697558 1707070 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1124 09:37:58.697602 1707070 kubeadm.go:319] 	- The kubelet is not running
	I1124 09:37:58.697730 1707070 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1124 09:37:58.697737 1707070 kubeadm.go:319] 
	I1124 09:37:58.697847 1707070 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1124 09:37:58.697878 1707070 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1124 09:37:58.697921 1707070 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1124 09:37:58.697925 1707070 kubeadm.go:319] 
	I1124 09:37:58.701577 1707070 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1124 09:37:58.701990 1707070 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1124 09:37:58.702104 1707070 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1124 09:37:58.702344 1707070 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	I1124 09:37:58.702350 1707070 kubeadm.go:319] 
	I1124 09:37:58.702417 1707070 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	I1124 09:37:58.702481 1707070 kubeadm.go:403] duration metric: took 12m9.652556415s to StartCluster
	I1124 09:37:58.702514 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:37:58.702578 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:37:58.726968 1707070 cri.go:89] found id: ""
	I1124 09:37:58.726981 1707070 logs.go:282] 0 containers: []
	W1124 09:37:58.726988 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:37:58.726994 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:37:58.727055 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:37:58.756184 1707070 cri.go:89] found id: ""
	I1124 09:37:58.756198 1707070 logs.go:282] 0 containers: []
	W1124 09:37:58.756205 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:37:58.756210 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:37:58.756266 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:37:58.781056 1707070 cri.go:89] found id: ""
	I1124 09:37:58.781070 1707070 logs.go:282] 0 containers: []
	W1124 09:37:58.781077 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:37:58.781082 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:37:58.781145 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:37:58.805769 1707070 cri.go:89] found id: ""
	I1124 09:37:58.805783 1707070 logs.go:282] 0 containers: []
	W1124 09:37:58.805790 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:37:58.805796 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:37:58.805854 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:37:58.830758 1707070 cri.go:89] found id: ""
	I1124 09:37:58.830780 1707070 logs.go:282] 0 containers: []
	W1124 09:37:58.830791 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:37:58.830797 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:37:58.830857 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:37:58.855967 1707070 cri.go:89] found id: ""
	I1124 09:37:58.855981 1707070 logs.go:282] 0 containers: []
	W1124 09:37:58.855988 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:37:58.855994 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:37:58.856051 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:37:58.890842 1707070 cri.go:89] found id: ""
	I1124 09:37:58.890857 1707070 logs.go:282] 0 containers: []
	W1124 09:37:58.890865 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:37:58.890873 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:37:58.890885 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:37:58.910142 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:37:58.910157 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:37:58.985463 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:37:58.976283   21584 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:37:58.977104   21584 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:37:58.978904   21584 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:37:58.979496   21584 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:37:58.981268   21584 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:37:58.976283   21584 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:37:58.977104   21584 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:37:58.978904   21584 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:37:58.979496   21584 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:37:58.981268   21584 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:37:58.985474 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:37:58.985486 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:37:59.051823 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:37:59.051845 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:37:59.080123 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:37:59.080139 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	W1124 09:37:59.137954 1707070 out.go:434] Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001230859s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	W1124 09:37:59.138000 1707070 out.go:285] * 
	W1124 09:37:59.138117 1707070 out.go:285] X Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001230859s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	W1124 09:37:59.138177 1707070 out.go:285] * 
	W1124 09:37:59.140306 1707070 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1124 09:37:59.145839 1707070 out.go:203] 
	W1124 09:37:59.149636 1707070 out.go:285] X Exiting due to K8S_KUBELET_NOT_RUNNING: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001230859s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	W1124 09:37:59.149678 1707070 out.go:285] * Suggestion: Check output of 'journalctl -xeu kubelet', try passing --extra-config=kubelet.cgroup-driver=systemd to minikube start
	W1124 09:37:59.149707 1707070 out.go:285] * Related issue: https://github.com/kubernetes/minikube/issues/4172
	I1124 09:37:59.153358 1707070 out.go:203] 
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> containerd <==
	Nov 24 09:38:08 functional-291288 containerd[10324]: time="2025-11-24T09:38:08.742098573Z" level=info msg="ImageUpdate event name:\"docker.io/kicbase/echo-server:functional-291288\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Nov 24 09:38:09 functional-291288 containerd[10324]: time="2025-11-24T09:38:09.725115511Z" level=info msg="RemoveImage \"kicbase/echo-server:functional-291288\""
	Nov 24 09:38:09 functional-291288 containerd[10324]: time="2025-11-24T09:38:09.727769003Z" level=info msg="ImageDelete event name:\"docker.io/kicbase/echo-server:functional-291288\""
	Nov 24 09:38:09 functional-291288 containerd[10324]: time="2025-11-24T09:38:09.729992993Z" level=info msg="ImageDelete event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\""
	Nov 24 09:38:09 functional-291288 containerd[10324]: time="2025-11-24T09:38:09.740634129Z" level=info msg="RemoveImage \"kicbase/echo-server:functional-291288\" returns successfully"
	Nov 24 09:38:10 functional-291288 containerd[10324]: time="2025-11-24T09:38:10.017725287Z" level=info msg="No images store for sha256:af1a838d2702e4e84137a83a66ae93ebb59c7bf115bf022cc84ce1a55dfd3fb4"
	Nov 24 09:38:10 functional-291288 containerd[10324]: time="2025-11-24T09:38:10.020247594Z" level=info msg="ImageCreate event name:\"docker.io/kicbase/echo-server:functional-291288\""
	Nov 24 09:38:10 functional-291288 containerd[10324]: time="2025-11-24T09:38:10.028698216Z" level=info msg="ImageCreate event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Nov 24 09:38:10 functional-291288 containerd[10324]: time="2025-11-24T09:38:10.029232770Z" level=info msg="ImageUpdate event name:\"docker.io/kicbase/echo-server:functional-291288\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Nov 24 09:38:11 functional-291288 containerd[10324]: time="2025-11-24T09:38:11.459119625Z" level=info msg="RemoveImage \"kicbase/echo-server:functional-291288\""
	Nov 24 09:38:11 functional-291288 containerd[10324]: time="2025-11-24T09:38:11.462708306Z" level=info msg="ImageDelete event name:\"docker.io/kicbase/echo-server:functional-291288\""
	Nov 24 09:38:11 functional-291288 containerd[10324]: time="2025-11-24T09:38:11.465197440Z" level=info msg="ImageDelete event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\""
	Nov 24 09:38:11 functional-291288 containerd[10324]: time="2025-11-24T09:38:11.482046877Z" level=info msg="RemoveImage \"kicbase/echo-server:functional-291288\" returns successfully"
	Nov 24 09:38:11 functional-291288 containerd[10324]: time="2025-11-24T09:38:11.784805820Z" level=info msg="No images store for sha256:af1a838d2702e4e84137a83a66ae93ebb59c7bf115bf022cc84ce1a55dfd3fb4"
	Nov 24 09:38:11 functional-291288 containerd[10324]: time="2025-11-24T09:38:11.787158164Z" level=info msg="ImageCreate event name:\"docker.io/kicbase/echo-server:functional-291288\""
	Nov 24 09:38:11 functional-291288 containerd[10324]: time="2025-11-24T09:38:11.795127091Z" level=info msg="ImageCreate event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Nov 24 09:38:11 functional-291288 containerd[10324]: time="2025-11-24T09:38:11.795603535Z" level=info msg="ImageUpdate event name:\"docker.io/kicbase/echo-server:functional-291288\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Nov 24 09:38:12 functional-291288 containerd[10324]: time="2025-11-24T09:38:12.816798467Z" level=info msg="RemoveImage \"kicbase/echo-server:functional-291288\""
	Nov 24 09:38:12 functional-291288 containerd[10324]: time="2025-11-24T09:38:12.819240765Z" level=info msg="ImageDelete event name:\"docker.io/kicbase/echo-server:functional-291288\""
	Nov 24 09:38:12 functional-291288 containerd[10324]: time="2025-11-24T09:38:12.822304091Z" level=info msg="ImageDelete event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\""
	Nov 24 09:38:12 functional-291288 containerd[10324]: time="2025-11-24T09:38:12.835765777Z" level=info msg="RemoveImage \"kicbase/echo-server:functional-291288\" returns successfully"
	Nov 24 09:38:13 functional-291288 containerd[10324]: time="2025-11-24T09:38:13.649607557Z" level=info msg="No images store for sha256:80154cc39374c5be6259fccbd4295ce399d3a1d7b6e10b99200044587775c910"
	Nov 24 09:38:13 functional-291288 containerd[10324]: time="2025-11-24T09:38:13.651890157Z" level=info msg="ImageCreate event name:\"docker.io/kicbase/echo-server:functional-291288\""
	Nov 24 09:38:13 functional-291288 containerd[10324]: time="2025-11-24T09:38:13.659732716Z" level=info msg="ImageCreate event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Nov 24 09:38:13 functional-291288 containerd[10324]: time="2025-11-24T09:38:13.660101507Z" level=info msg="ImageUpdate event name:\"docker.io/kicbase/echo-server:functional-291288\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:39:40.929605   23503 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:39:40.930427   23503 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:39:40.932109   23503 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:39:40.932617   23503 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:39:40.934079   23503 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[Nov24 08:20] overlayfs: idmapped layers are currently not supported
	[Nov24 08:21] overlayfs: idmapped layers are currently not supported
	[Nov24 08:22] overlayfs: idmapped layers are currently not supported
	[Nov24 08:23] overlayfs: idmapped layers are currently not supported
	[Nov24 08:24] overlayfs: idmapped layers are currently not supported
	[ +19.566509] overlayfs: idmapped layers are currently not supported
	[Nov24 08:25] overlayfs: idmapped layers are currently not supported
	[ +35.934095] overlayfs: idmapped layers are currently not supported
	[Nov24 08:27] overlayfs: idmapped layers are currently not supported
	[Nov24 08:28] overlayfs: idmapped layers are currently not supported
	[Nov24 08:29] overlayfs: idmapped layers are currently not supported
	[Nov24 08:30] overlayfs: idmapped layers are currently not supported
	[Nov24 08:31] overlayfs: idmapped layers are currently not supported
	[ +44.505180] overlayfs: idmapped layers are currently not supported
	[Nov24 08:32] overlayfs: idmapped layers are currently not supported
	[Nov24 08:33] overlayfs: idmapped layers are currently not supported
	[Nov24 08:34] overlayfs: idmapped layers are currently not supported
	[ +21.137877] overlayfs: idmapped layers are currently not supported
	[ +28.724175] overlayfs: idmapped layers are currently not supported
	[Nov24 08:36] overlayfs: idmapped layers are currently not supported
	[Nov24 08:37] overlayfs: idmapped layers are currently not supported
	[Nov24 08:38] overlayfs: idmapped layers are currently not supported
	[  +4.063834] overlayfs: idmapped layers are currently not supported
	[Nov24 08:40] overlayfs: idmapped layers are currently not supported
	[Nov24 08:42] kauditd_printk_skb: 8 callbacks suppressed
	
	
	==> kernel <==
	 09:39:40 up  8:21,  0 user,  load average: 0.91, 0.40, 0.38
	Linux functional-291288 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Nov 24 09:39:37 functional-291288 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Nov 24 09:39:38 functional-291288 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 453.
	Nov 24 09:39:38 functional-291288 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:39:38 functional-291288 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:39:38 functional-291288 kubelet[23392]: E1124 09:39:38.673644   23392 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Nov 24 09:39:38 functional-291288 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Nov 24 09:39:38 functional-291288 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Nov 24 09:39:39 functional-291288 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 454.
	Nov 24 09:39:39 functional-291288 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:39:39 functional-291288 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:39:39 functional-291288 kubelet[23398]: E1124 09:39:39.446326   23398 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Nov 24 09:39:39 functional-291288 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Nov 24 09:39:39 functional-291288 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Nov 24 09:39:40 functional-291288 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 455.
	Nov 24 09:39:40 functional-291288 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:39:40 functional-291288 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:39:40 functional-291288 kubelet[23419]: E1124 09:39:40.200942   23419 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Nov 24 09:39:40 functional-291288 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Nov 24 09:39:40 functional-291288 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Nov 24 09:39:40 functional-291288 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 456.
	Nov 24 09:39:40 functional-291288 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:39:40 functional-291288 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:39:40 functional-291288 kubelet[23507]: E1124 09:39:40.947988   23507 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Nov 24 09:39:40 functional-291288 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Nov 24 09:39:40 functional-291288 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	

-- /stdout --
helpers_test.go:262: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-291288 -n functional-291288
helpers_test.go:262: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-291288 -n functional-291288: exit status 2 (391.995174ms)

-- stdout --
	Stopped

-- /stdout --
helpers_test.go:262: status error: exit status 2 (may be ok)
helpers_test.go:264: "functional-291288" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmdConnect (2.44s)
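The kubelet restart loop captured above repeats the same root cause: kubelet v1.35.0-beta.0 refuses to validate its configuration on a cgroup v1 host ("kubelet is configured to not run on a host using cgroup v1"). A minimal sketch for checking which cgroup mode the host is running before `minikube start` — the `--extra-config` flag below is taken from the suggestion printed in the log itself, and `/sys/fs/cgroup` is the standard mount point on systemd hosts:

```shell
# Report the filesystem type backing /sys/fs/cgroup:
#   "cgroup2fs" -> unified cgroup v2 hierarchy (what kubelet v1.35+ expects)
#   "tmpfs"     -> legacy cgroup v1 hierarchy (the mode this run failed on)
stat -fc %T /sys/fs/cgroup/

# On a cgroup v1 host, the log's own suggestion is to retry with:
#   minikube start --extra-config=kubelet.cgroup-driver=systemd
# or, per the kubeadm warning, opt back into v1 by setting the kubelet
# configuration option FailCgroupV1 to false, or boot the host into
# cgroup v2 (kernel cmdline: systemd.unified_cgroup_hierarchy=1).
```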

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim (241.65s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim


=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim
functional_test_pvc_test.go:50: (dbg) TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: waiting 4m0s for pods matching "integration-test=storage-provisioner" in namespace "kube-system" ...
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
I1124 09:38:18.317812 1654467 retry.go:31] will retry after 6.294827337s: Temporary Error: Get "http:": http: no Host in request URL
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
I1124 09:38:24.613426 1654467 retry.go:31] will retry after 4.110593158s: Temporary Error: Get "http:": http: no Host in request URL
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
	[previous warning repeated 3 more times]
I1124 09:38:28.724229 1654467 retry.go:31] will retry after 5.57931128s: Temporary Error: Get "http:": http: no Host in request URL
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
	[previous warning repeated 4 more times]
I1124 09:38:34.304381 1654467 retry.go:31] will retry after 13.721648474s: Temporary Error: Get "http:": http: no Host in request URL
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
	[previous warning repeated 13 more times]
I1124 09:38:48.026337 1654467 retry.go:31] will retry after 26.323703642s: Temporary Error: Get "http:": http: no Host in request URL
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
	[previous warning repeated 25 more times]
I1124 09:39:14.350445 1654467 retry.go:31] will retry after 24.806090911s: Temporary Error: Get "http:": http: no Host in request URL
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
	[previous warning repeated 43 more times]
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
E1124 09:41:03.604857 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/addons-674149/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
	[previous warning repeated 21 times in total; duplicate lines omitted]
E1124 09:41:24.717417 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-941011/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: [identical WARNING repeated 49 times in total]
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: client rate limiter Wait returned an error: context deadline exceeded
functional_test_pvc_test.go:50: ***** TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: pod "integration-test=storage-provisioner" failed to start within 4m0s: context deadline exceeded ****
functional_test_pvc_test.go:50: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-291288 -n functional-291288
functional_test_pvc_test.go:50: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-291288 -n functional-291288: exit status 2 (312.854227ms)

-- stdout --
	Stopped

-- /stdout --
functional_test_pvc_test.go:50: status error: exit status 2 (may be ok)
functional_test_pvc_test.go:50: "functional-291288" apiserver is not running, skipping kubectl commands (state="Stopped")
functional_test_pvc_test.go:51: failed waiting for storage-provisioner: integration-test=storage-provisioner within 4m0s: context deadline exceeded
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:223: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim]: network settings <======
helpers_test.go:230: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:238: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim]: docker inspect <======
helpers_test.go:239: (dbg) Run:  docker inspect functional-291288
helpers_test.go:243: (dbg) docker inspect functional-291288:

-- stdout --
	[
	    {
	        "Id": "70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52",
	        "Created": "2025-11-24T09:10:51.896020191Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 1695240,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-11-24T09:10:51.968983407Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:572c983e466f1f784136812eef5cc59ac623db764bc7704d3676c4643993fd08",
	        "ResolvConfPath": "/var/lib/docker/containers/70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52/hostname",
	        "HostsPath": "/var/lib/docker/containers/70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52/hosts",
	        "LogPath": "/var/lib/docker/containers/70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52/70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52-json.log",
	        "Name": "/functional-291288",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "functional-291288:/var",
	                "/lib/modules:/lib/modules:ro"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-291288",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52",
	                "LowerDir": "/var/lib/docker/overlay2/3cc6f5f8c809d502c515552ad283ef0dee330beb830a15376b0447c77fbc81b7-init/diff:/var/lib/docker/overlay2/bf294786c7ade59cb761c7bdcc27045362c3779b00a569b685443f34ade17737/diff",
	                "MergedDir": "/var/lib/docker/overlay2/3cc6f5f8c809d502c515552ad283ef0dee330beb830a15376b0447c77fbc81b7/merged",
	                "UpperDir": "/var/lib/docker/overlay2/3cc6f5f8c809d502c515552ad283ef0dee330beb830a15376b0447c77fbc81b7/diff",
	                "WorkDir": "/var/lib/docker/overlay2/3cc6f5f8c809d502c515552ad283ef0dee330beb830a15376b0447c77fbc81b7/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "volume",
	                "Name": "functional-291288",
	                "Source": "/var/lib/docker/volumes/functional-291288/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            },
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-291288",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-291288",
	                "name.minikube.sigs.k8s.io": "functional-291288",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "09c1c2eef0dca6362dde63b4cbc372c0cfa3e4fd084b8745043d8b88925691bf",
	            "SandboxKey": "/var/run/docker/netns/09c1c2eef0dc",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34684"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34685"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34688"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34686"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34687"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-291288": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "7e:49:22:0b:f9:2c",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "1e8f91e8ad9f46b831bbb1b0589b0022d940ee9875e64a648dc80612f3ca93dc",
	                    "EndpointID": "5de5ca8ccb07584b21e6e4e30dba12e0233e8d28c3e48e705cddffe75263b337",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-291288",
	                        "70848be15fcc"
	                    ]
	                }
	            }
	        }
	    }
	]

-- /stdout --
helpers_test.go:247: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-291288 -n functional-291288
helpers_test.go:247: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-291288 -n functional-291288: exit status 2 (325.717425ms)

-- stdout --
	Running

-- /stdout --
helpers_test.go:247: status error: exit status 2 (may be ok)
helpers_test.go:252: <<< TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim FAILED: start of post-mortem logs <<<
helpers_test.go:253: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim]: minikube logs <======
helpers_test.go:255: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 logs -n 25
helpers_test.go:260: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim logs: 
-- stdout --
	
	==> Audit <==
	┌────────────────┬─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│    COMMAND     │                                                                        ARGS                                                                         │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├────────────────┼─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ ssh            │ functional-291288 ssh findmnt -T /mount-9p | grep 9p                                                                                                │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:39 UTC │ 24 Nov 25 09:39 UTC │
	│ ssh            │ functional-291288 ssh -- ls -la /mount-9p                                                                                                           │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:39 UTC │ 24 Nov 25 09:39 UTC │
	│ ssh            │ functional-291288 ssh sudo umount -f /mount-9p                                                                                                      │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:39 UTC │                     │
	│ mount          │ -p functional-291288 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo2066364349/001:/mount1 --alsologtostderr -v=1                │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:39 UTC │                     │
	│ mount          │ -p functional-291288 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo2066364349/001:/mount2 --alsologtostderr -v=1                │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:39 UTC │                     │
	│ ssh            │ functional-291288 ssh findmnt -T /mount1                                                                                                            │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:39 UTC │                     │
	│ mount          │ -p functional-291288 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo2066364349/001:/mount3 --alsologtostderr -v=1                │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:39 UTC │                     │
	│ ssh            │ functional-291288 ssh findmnt -T /mount1                                                                                                            │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:39 UTC │ 24 Nov 25 09:39 UTC │
	│ ssh            │ functional-291288 ssh findmnt -T /mount2                                                                                                            │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:39 UTC │ 24 Nov 25 09:39 UTC │
	│ ssh            │ functional-291288 ssh findmnt -T /mount3                                                                                                            │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:39 UTC │ 24 Nov 25 09:39 UTC │
	│ mount          │ -p functional-291288 --kill=true                                                                                                                    │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:39 UTC │                     │
	│ start          │ -p functional-291288 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-beta.0 │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:39 UTC │                     │
	│ start          │ -p functional-291288 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-beta.0 │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:39 UTC │                     │
	│ start          │ -p functional-291288 --dry-run --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-beta.0           │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:39 UTC │                     │
	│ dashboard      │ --url --port 36195 -p functional-291288 --alsologtostderr -v=1                                                                                      │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:39 UTC │                     │
	│ update-context │ functional-291288 update-context --alsologtostderr -v=2                                                                                             │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:39 UTC │ 24 Nov 25 09:39 UTC │
	│ update-context │ functional-291288 update-context --alsologtostderr -v=2                                                                                             │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:39 UTC │ 24 Nov 25 09:39 UTC │
	│ update-context │ functional-291288 update-context --alsologtostderr -v=2                                                                                             │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:39 UTC │ 24 Nov 25 09:39 UTC │
	│ image          │ functional-291288 image ls --format short --alsologtostderr                                                                                         │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:39 UTC │ 24 Nov 25 09:40 UTC │
	│ image          │ functional-291288 image ls --format yaml --alsologtostderr                                                                                          │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:40 UTC │ 24 Nov 25 09:40 UTC │
	│ ssh            │ functional-291288 ssh pgrep buildkitd                                                                                                               │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:40 UTC │                     │
	│ image          │ functional-291288 image build -t localhost/my-image:functional-291288 testdata/build --alsologtostderr                                              │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:40 UTC │ 24 Nov 25 09:40 UTC │
	│ image          │ functional-291288 image ls                                                                                                                          │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:40 UTC │ 24 Nov 25 09:40 UTC │
	│ image          │ functional-291288 image ls --format json --alsologtostderr                                                                                          │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:40 UTC │ 24 Nov 25 09:40 UTC │
	│ image          │ functional-291288 image ls --format table --alsologtostderr                                                                                         │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:40 UTC │ 24 Nov 25 09:40 UTC │
	└────────────────┴─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/11/24 09:39:56
	Running on machine: ip-172-31-30-239
	Binary: Built with gc go1.25.3 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1124 09:39:56.954354 1725830 out.go:360] Setting OutFile to fd 1 ...
	I1124 09:39:56.954575 1725830 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1124 09:39:56.954607 1725830 out.go:374] Setting ErrFile to fd 2...
	I1124 09:39:56.954629 1725830 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1124 09:39:56.954894 1725830 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21978-1652607/.minikube/bin
	I1124 09:39:56.955306 1725830 out.go:368] Setting JSON to false
	I1124 09:39:56.956175 1725830 start.go:133] hostinfo: {"hostname":"ip-172-31-30-239","uptime":30126,"bootTime":1763947071,"procs":159,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"92f46a7d-c249-4c12-924a-77f64874c910"}
	I1124 09:39:56.956275 1725830 start.go:143] virtualization:  
	I1124 09:39:56.959619 1725830 out.go:179] * [functional-291288] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1124 09:39:56.962741 1725830 notify.go:221] Checking for updates...
	I1124 09:39:56.963243 1725830 out.go:179]   - MINIKUBE_LOCATION=21978
	I1124 09:39:56.966478 1725830 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1124 09:39:56.969366 1725830 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/21978-1652607/kubeconfig
	I1124 09:39:56.972220 1725830 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/21978-1652607/.minikube
	I1124 09:39:56.975068 1725830 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1124 09:39:56.977802 1725830 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1124 09:39:56.983230 1725830 config.go:182] Loaded profile config "functional-291288": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1124 09:39:56.983876 1725830 driver.go:422] Setting default libvirt URI to qemu:///system
	I1124 09:39:57.018751 1725830 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1124 09:39:57.018862 1725830 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1124 09:39:57.081141 1725830 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-11-24 09:39:57.071744274 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214831104 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1124 09:39:57.081249 1725830 docker.go:319] overlay module found
	I1124 09:39:57.084257 1725830 out.go:179] * Using the docker driver based on existing profile
	I1124 09:39:57.086942 1725830 start.go:309] selected driver: docker
	I1124 09:39:57.086961 1725830 start.go:927] validating driver "docker" against &{Name:functional-291288 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-291288 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1124 09:39:57.087105 1725830 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1124 09:39:57.087221 1725830 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1124 09:39:57.145343 1725830 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-11-24 09:39:57.136089496 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214831104 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1124 09:39:57.145801 1725830 cni.go:84] Creating CNI manager for ""
	I1124 09:39:57.145879 1725830 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1124 09:39:57.145918 1725830 start.go:353] cluster config:
	{Name:functional-291288 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-291288 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1124 09:39:57.148910 1725830 out.go:179] * dry-run validation complete!
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> containerd <==
	Nov 24 09:38:10 functional-291288 containerd[10324]: time="2025-11-24T09:38:10.028698216Z" level=info msg="ImageCreate event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Nov 24 09:38:10 functional-291288 containerd[10324]: time="2025-11-24T09:38:10.029232770Z" level=info msg="ImageUpdate event name:\"docker.io/kicbase/echo-server:functional-291288\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Nov 24 09:38:11 functional-291288 containerd[10324]: time="2025-11-24T09:38:11.459119625Z" level=info msg="RemoveImage \"kicbase/echo-server:functional-291288\""
	Nov 24 09:38:11 functional-291288 containerd[10324]: time="2025-11-24T09:38:11.462708306Z" level=info msg="ImageDelete event name:\"docker.io/kicbase/echo-server:functional-291288\""
	Nov 24 09:38:11 functional-291288 containerd[10324]: time="2025-11-24T09:38:11.465197440Z" level=info msg="ImageDelete event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\""
	Nov 24 09:38:11 functional-291288 containerd[10324]: time="2025-11-24T09:38:11.482046877Z" level=info msg="RemoveImage \"kicbase/echo-server:functional-291288\" returns successfully"
	Nov 24 09:38:11 functional-291288 containerd[10324]: time="2025-11-24T09:38:11.784805820Z" level=info msg="No images store for sha256:af1a838d2702e4e84137a83a66ae93ebb59c7bf115bf022cc84ce1a55dfd3fb4"
	Nov 24 09:38:11 functional-291288 containerd[10324]: time="2025-11-24T09:38:11.787158164Z" level=info msg="ImageCreate event name:\"docker.io/kicbase/echo-server:functional-291288\""
	Nov 24 09:38:11 functional-291288 containerd[10324]: time="2025-11-24T09:38:11.795127091Z" level=info msg="ImageCreate event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Nov 24 09:38:11 functional-291288 containerd[10324]: time="2025-11-24T09:38:11.795603535Z" level=info msg="ImageUpdate event name:\"docker.io/kicbase/echo-server:functional-291288\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Nov 24 09:38:12 functional-291288 containerd[10324]: time="2025-11-24T09:38:12.816798467Z" level=info msg="RemoveImage \"kicbase/echo-server:functional-291288\""
	Nov 24 09:38:12 functional-291288 containerd[10324]: time="2025-11-24T09:38:12.819240765Z" level=info msg="ImageDelete event name:\"docker.io/kicbase/echo-server:functional-291288\""
	Nov 24 09:38:12 functional-291288 containerd[10324]: time="2025-11-24T09:38:12.822304091Z" level=info msg="ImageDelete event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\""
	Nov 24 09:38:12 functional-291288 containerd[10324]: time="2025-11-24T09:38:12.835765777Z" level=info msg="RemoveImage \"kicbase/echo-server:functional-291288\" returns successfully"
	Nov 24 09:38:13 functional-291288 containerd[10324]: time="2025-11-24T09:38:13.649607557Z" level=info msg="No images store for sha256:80154cc39374c5be6259fccbd4295ce399d3a1d7b6e10b99200044587775c910"
	Nov 24 09:38:13 functional-291288 containerd[10324]: time="2025-11-24T09:38:13.651890157Z" level=info msg="ImageCreate event name:\"docker.io/kicbase/echo-server:functional-291288\""
	Nov 24 09:38:13 functional-291288 containerd[10324]: time="2025-11-24T09:38:13.659732716Z" level=info msg="ImageCreate event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Nov 24 09:38:13 functional-291288 containerd[10324]: time="2025-11-24T09:38:13.660101507Z" level=info msg="ImageUpdate event name:\"docker.io/kicbase/echo-server:functional-291288\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Nov 24 09:40:03 functional-291288 containerd[10324]: time="2025-11-24T09:40:03.611897767Z" level=info msg="connecting to shim z2wfn4pj8a6wj7emhfpm2rcwg" address="unix:///run/containerd/s/f7ca6e8f9c5eae7caf77f8d16117cbfb8da91b535240c440d4a28ed09979a204" namespace=k8s.io protocol=ttrpc version=3
	Nov 24 09:40:03 functional-291288 containerd[10324]: time="2025-11-24T09:40:03.716105406Z" level=info msg="shim disconnected" id=z2wfn4pj8a6wj7emhfpm2rcwg namespace=k8s.io
	Nov 24 09:40:03 functional-291288 containerd[10324]: time="2025-11-24T09:40:03.716156697Z" level=info msg="cleaning up after shim disconnected" id=z2wfn4pj8a6wj7emhfpm2rcwg namespace=k8s.io
	Nov 24 09:40:03 functional-291288 containerd[10324]: time="2025-11-24T09:40:03.716168430Z" level=info msg="cleaning up dead shim" id=z2wfn4pj8a6wj7emhfpm2rcwg namespace=k8s.io
	Nov 24 09:40:03 functional-291288 containerd[10324]: time="2025-11-24T09:40:03.955371899Z" level=info msg="ImageCreate event name:\"localhost/my-image:functional-291288\""
	Nov 24 09:40:03 functional-291288 containerd[10324]: time="2025-11-24T09:40:03.962751692Z" level=info msg="ImageCreate event name:\"sha256:ecd6403f78577a6f280ca2286cd284f80ed0beb4d5904124a567ceee20dd7903\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Nov 24 09:40:03 functional-291288 containerd[10324]: time="2025-11-24T09:40:03.963336478Z" level=info msg="ImageUpdate event name:\"localhost/my-image:functional-291288\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:42:15.748289   25824 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:42:15.749127   25824 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:42:15.750876   25824 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:42:15.751185   25824 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:42:15.752710   25824 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[Nov24 08:20] overlayfs: idmapped layers are currently not supported
	[Nov24 08:21] overlayfs: idmapped layers are currently not supported
	[Nov24 08:22] overlayfs: idmapped layers are currently not supported
	[Nov24 08:23] overlayfs: idmapped layers are currently not supported
	[Nov24 08:24] overlayfs: idmapped layers are currently not supported
	[ +19.566509] overlayfs: idmapped layers are currently not supported
	[Nov24 08:25] overlayfs: idmapped layers are currently not supported
	[ +35.934095] overlayfs: idmapped layers are currently not supported
	[Nov24 08:27] overlayfs: idmapped layers are currently not supported
	[Nov24 08:28] overlayfs: idmapped layers are currently not supported
	[Nov24 08:29] overlayfs: idmapped layers are currently not supported
	[Nov24 08:30] overlayfs: idmapped layers are currently not supported
	[Nov24 08:31] overlayfs: idmapped layers are currently not supported
	[ +44.505180] overlayfs: idmapped layers are currently not supported
	[Nov24 08:32] overlayfs: idmapped layers are currently not supported
	[Nov24 08:33] overlayfs: idmapped layers are currently not supported
	[Nov24 08:34] overlayfs: idmapped layers are currently not supported
	[ +21.137877] overlayfs: idmapped layers are currently not supported
	[ +28.724175] overlayfs: idmapped layers are currently not supported
	[Nov24 08:36] overlayfs: idmapped layers are currently not supported
	[Nov24 08:37] overlayfs: idmapped layers are currently not supported
	[Nov24 08:38] overlayfs: idmapped layers are currently not supported
	[  +4.063834] overlayfs: idmapped layers are currently not supported
	[Nov24 08:40] overlayfs: idmapped layers are currently not supported
	[Nov24 08:42] kauditd_printk_skb: 8 callbacks suppressed
	
	
	==> kernel <==
	 09:42:15 up  8:24,  0 user,  load average: 0.14, 0.30, 0.35
	Linux functional-291288 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Nov 24 09:42:12 functional-291288 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Nov 24 09:42:13 functional-291288 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 659.
	Nov 24 09:42:13 functional-291288 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:42:13 functional-291288 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:42:13 functional-291288 kubelet[25689]: E1124 09:42:13.423275   25689 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Nov 24 09:42:13 functional-291288 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Nov 24 09:42:13 functional-291288 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Nov 24 09:42:14 functional-291288 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 660.
	Nov 24 09:42:14 functional-291288 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:42:14 functional-291288 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:42:14 functional-291288 kubelet[25695]: E1124 09:42:14.175584   25695 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Nov 24 09:42:14 functional-291288 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Nov 24 09:42:14 functional-291288 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Nov 24 09:42:14 functional-291288 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 661.
	Nov 24 09:42:14 functional-291288 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:42:14 functional-291288 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:42:14 functional-291288 kubelet[25716]: E1124 09:42:14.939296   25716 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Nov 24 09:42:14 functional-291288 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Nov 24 09:42:14 functional-291288 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Nov 24 09:42:15 functional-291288 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 662.
	Nov 24 09:42:15 functional-291288 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:42:15 functional-291288 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:42:15 functional-291288 kubelet[25806]: E1124 09:42:15.696158   25806 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Nov 24 09:42:15 functional-291288 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Nov 24 09:42:15 functional-291288 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	

-- /stdout --
helpers_test.go:262: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-291288 -n functional-291288
helpers_test.go:262: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-291288 -n functional-291288: exit status 2 (322.938992ms)

-- stdout --
	Stopped

-- /stdout --
helpers_test.go:262: status error: exit status 2 (may be ok)
helpers_test.go:264: "functional-291288" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim (241.65s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/NodeLabels (3.24s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/NodeLabels
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/NodeLabels

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/NodeLabels
functional_test.go:234: (dbg) Run:  kubectl --context functional-291288 get nodes --output=go-template "--template='{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'"
functional_test.go:234: (dbg) Non-zero exit: kubectl --context functional-291288 get nodes --output=go-template "--template='{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'": exit status 1 (96.592199ms)

-- stdout --
	'Error executing template: template: output:1:20: executing "output" at <index .items 0>: error calling index: reflect: slice index out of range. Printing more information for debugging the template:
		template was:
			'{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'
		raw data was:
			{"apiVersion":"v1","items":[],"kind":"List","metadata":{"resourceVersion":""}}
		object given to template engine was:
			map[apiVersion:v1 items:[] kind:List metadata:map[resourceVersion:]]
	

-- /stdout --
** stderr ** 
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?
	error executing template "'{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'": template: output:1:20: executing "output" at <index .items 0>: error calling index: reflect: slice index out of range

** /stderr **
functional_test.go:236: failed to 'kubectl get nodes' with args "kubectl --context functional-291288 get nodes --output=go-template \"--template='{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'\"": exit status 1
functional_test.go:242: expected to have label "minikube.k8s.io/commit" in node labels but got : 
-- stdout --
	'Error executing template: template: output:1:20: executing "output" at <index .items 0>: error calling index: reflect: slice index out of range. Printing more information for debugging the template:
		template was:
			'{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'
		raw data was:
			{"apiVersion":"v1","items":[],"kind":"List","metadata":{"resourceVersion":""}}
		object given to template engine was:
			map[apiVersion:v1 items:[] kind:List metadata:map[resourceVersion:]]
	

-- /stdout --
** stderr ** 
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?
	error executing template "'{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'": template: output:1:20: executing "output" at <index .items 0>: error calling index: reflect: slice index out of range

** /stderr **
functional_test.go:242: expected to have label "minikube.k8s.io/version" in node labels but got : 
-- stdout --
	'Error executing template: template: output:1:20: executing "output" at <index .items 0>: error calling index: reflect: slice index out of range. Printing more information for debugging the template:
		template was:
			'{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'
		raw data was:
			{"apiVersion":"v1","items":[],"kind":"List","metadata":{"resourceVersion":""}}
		object given to template engine was:
			map[apiVersion:v1 items:[] kind:List metadata:map[resourceVersion:]]
	

-- /stdout --
** stderr ** 
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?
	error executing template "'{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'": template: output:1:20: executing "output" at <index .items 0>: error calling index: reflect: slice index out of range

** /stderr **
functional_test.go:242: expected to have label "minikube.k8s.io/updated_at" in node labels but got : 
-- stdout --
	'Error executing template: template: output:1:20: executing "output" at <index .items 0>: error calling index: reflect: slice index out of range. Printing more information for debugging the template:
		template was:
			'{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'
		raw data was:
			{"apiVersion":"v1","items":[],"kind":"List","metadata":{"resourceVersion":""}}
		object given to template engine was:
			map[apiVersion:v1 items:[] kind:List metadata:map[resourceVersion:]]
	

-- /stdout --
** stderr ** 
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?
	error executing template "'{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'": template: output:1:20: executing "output" at <index .items 0>: error calling index: reflect: slice index out of range

** /stderr **
functional_test.go:242: expected to have label "minikube.k8s.io/name" in node labels but got : 
-- stdout --
	'Error executing template: template: output:1:20: executing "output" at <index .items 0>: error calling index: reflect: slice index out of range. Printing more information for debugging the template:
		template was:
			'{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'
		raw data was:
			{"apiVersion":"v1","items":[],"kind":"List","metadata":{"resourceVersion":""}}
		object given to template engine was:
			map[apiVersion:v1 items:[] kind:List metadata:map[resourceVersion:]]
	

-- /stdout --
** stderr ** 
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?
	error executing template "'{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'": template: output:1:20: executing "output" at <index .items 0>: error calling index: reflect: slice index out of range

** /stderr **
functional_test.go:242: expected to have label "minikube.k8s.io/primary" in node labels but got : 
-- stdout --
	'Error executing template: template: output:1:20: executing "output" at <index .items 0>: error calling index: reflect: slice index out of range. Printing more information for debugging the template:
		template was:
			'{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'
		raw data was:
			{"apiVersion":"v1","items":[],"kind":"List","metadata":{"resourceVersion":""}}
		object given to template engine was:
			map[apiVersion:v1 items:[] kind:List metadata:map[resourceVersion:]]
	

-- /stdout --
** stderr ** 
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?
	error executing template "'{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'": template: output:1:20: executing "output" at <index .items 0>: error calling index: reflect: slice index out of range

** /stderr **
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:223: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/NodeLabels]: network settings <======
helpers_test.go:230: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:238: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/NodeLabels]: docker inspect <======
helpers_test.go:239: (dbg) Run:  docker inspect functional-291288
helpers_test.go:243: (dbg) docker inspect functional-291288:

-- stdout --
	[
	    {
	        "Id": "70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52",
	        "Created": "2025-11-24T09:10:51.896020191Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 1695240,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-11-24T09:10:51.968983407Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:572c983e466f1f784136812eef5cc59ac623db764bc7704d3676c4643993fd08",
	        "ResolvConfPath": "/var/lib/docker/containers/70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52/hostname",
	        "HostsPath": "/var/lib/docker/containers/70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52/hosts",
	        "LogPath": "/var/lib/docker/containers/70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52/70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52-json.log",
	        "Name": "/functional-291288",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "functional-291288:/var",
	                "/lib/modules:/lib/modules:ro"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-291288",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "70848be15fcc9785f5f5cf706db8b0f58a4a1aeae82bef0731067623a3dd0b52",
	                "LowerDir": "/var/lib/docker/overlay2/3cc6f5f8c809d502c515552ad283ef0dee330beb830a15376b0447c77fbc81b7-init/diff:/var/lib/docker/overlay2/bf294786c7ade59cb761c7bdcc27045362c3779b00a569b685443f34ade17737/diff",
	                "MergedDir": "/var/lib/docker/overlay2/3cc6f5f8c809d502c515552ad283ef0dee330beb830a15376b0447c77fbc81b7/merged",
	                "UpperDir": "/var/lib/docker/overlay2/3cc6f5f8c809d502c515552ad283ef0dee330beb830a15376b0447c77fbc81b7/diff",
	                "WorkDir": "/var/lib/docker/overlay2/3cc6f5f8c809d502c515552ad283ef0dee330beb830a15376b0447c77fbc81b7/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "volume",
	                "Name": "functional-291288",
	                "Source": "/var/lib/docker/volumes/functional-291288/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            },
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-291288",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-291288",
	                "name.minikube.sigs.k8s.io": "functional-291288",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "09c1c2eef0dca6362dde63b4cbc372c0cfa3e4fd084b8745043d8b88925691bf",
	            "SandboxKey": "/var/run/docker/netns/09c1c2eef0dc",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34684"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34685"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34688"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34686"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34687"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-291288": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "7e:49:22:0b:f9:2c",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "1e8f91e8ad9f46b831bbb1b0589b0022d940ee9875e64a648dc80612f3ca93dc",
	                    "EndpointID": "5de5ca8ccb07584b21e6e4e30dba12e0233e8d28c3e48e705cddffe75263b337",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-291288",
	                        "70848be15fcc"
	                    ]
	                }
	            }
	        }
	    }
	]

-- /stdout --
helpers_test.go:247: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-291288 -n functional-291288
helpers_test.go:247: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-291288 -n functional-291288: exit status 2 (399.77481ms)

-- stdout --
	Running

-- /stdout --
helpers_test.go:247: status error: exit status 2 (may be ok)
helpers_test.go:252: <<< TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/NodeLabels FAILED: start of post-mortem logs <<<
helpers_test.go:253: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/NodeLabels]: minikube logs <======
helpers_test.go:255: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 logs -n 25
helpers_test.go:255: (dbg) Done: out/minikube-linux-arm64 -p functional-291288 logs -n 25: (1.406300299s)
helpers_test.go:260: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/NodeLabels logs: 
-- stdout --
	
	==> Audit <==
	┌─────────┬──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│ COMMAND │                                                                             ARGS                                                                             │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├─────────┼──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ ssh     │ functional-291288 ssh sudo crictl images                                                                                                                     │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:25 UTC │ 24 Nov 25 09:25 UTC │
	│ ssh     │ functional-291288 ssh sudo crictl rmi registry.k8s.io/pause:latest                                                                                           │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:25 UTC │ 24 Nov 25 09:25 UTC │
	│ ssh     │ functional-291288 ssh sudo crictl inspecti registry.k8s.io/pause:latest                                                                                      │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:25 UTC │                     │
	│ cache   │ functional-291288 cache reload                                                                                                                               │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:25 UTC │ 24 Nov 25 09:25 UTC │
	│ ssh     │ functional-291288 ssh sudo crictl inspecti registry.k8s.io/pause:latest                                                                                      │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:25 UTC │ 24 Nov 25 09:25 UTC │
	│ cache   │ delete registry.k8s.io/pause:3.1                                                                                                                             │ minikube          │ jenkins │ v1.37.0 │ 24 Nov 25 09:25 UTC │ 24 Nov 25 09:25 UTC │
	│ cache   │ delete registry.k8s.io/pause:latest                                                                                                                          │ minikube          │ jenkins │ v1.37.0 │ 24 Nov 25 09:25 UTC │ 24 Nov 25 09:25 UTC │
	│ kubectl │ functional-291288 kubectl -- --context functional-291288 get pods                                                                                            │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:25 UTC │                     │
	│ start   │ -p functional-291288 --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision --wait=all                                                     │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:25 UTC │                     │
	│ config  │ functional-291288 config unset cpus                                                                                                                          │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:38 UTC │ 24 Nov 25 09:38 UTC │
	│ cp      │ functional-291288 cp testdata/cp-test.txt /home/docker/cp-test.txt                                                                                           │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:38 UTC │ 24 Nov 25 09:38 UTC │
	│ config  │ functional-291288 config get cpus                                                                                                                            │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:38 UTC │                     │
	│ config  │ functional-291288 config set cpus 2                                                                                                                          │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:38 UTC │ 24 Nov 25 09:38 UTC │
	│ config  │ functional-291288 config get cpus                                                                                                                            │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:38 UTC │ 24 Nov 25 09:38 UTC │
	│ config  │ functional-291288 config unset cpus                                                                                                                          │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:38 UTC │ 24 Nov 25 09:38 UTC │
	│ ssh     │ functional-291288 ssh -n functional-291288 sudo cat /home/docker/cp-test.txt                                                                                 │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:38 UTC │ 24 Nov 25 09:38 UTC │
	│ config  │ functional-291288 config get cpus                                                                                                                            │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:38 UTC │                     │
	│ license │                                                                                                                                                              │ minikube          │ jenkins │ v1.37.0 │ 24 Nov 25 09:38 UTC │ 24 Nov 25 09:38 UTC │
	│ cp      │ functional-291288 cp functional-291288:/home/docker/cp-test.txt /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelCp4225058004/001/cp-test.txt │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:38 UTC │ 24 Nov 25 09:38 UTC │
	│ ssh     │ functional-291288 ssh sudo systemctl is-active docker                                                                                                        │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:38 UTC │                     │
	│ ssh     │ functional-291288 ssh -n functional-291288 sudo cat /home/docker/cp-test.txt                                                                                 │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:38 UTC │ 24 Nov 25 09:38 UTC │
	│ ssh     │ functional-291288 ssh sudo systemctl is-active crio                                                                                                          │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:38 UTC │                     │
	│ cp      │ functional-291288 cp testdata/cp-test.txt /tmp/does/not/exist/cp-test.txt                                                                                    │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:38 UTC │ 24 Nov 25 09:38 UTC │
	│ ssh     │ functional-291288 ssh -n functional-291288 sudo cat /tmp/does/not/exist/cp-test.txt                                                                          │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:38 UTC │ 24 Nov 25 09:38 UTC │
	│ image   │ functional-291288 image load --daemon kicbase/echo-server:functional-291288 --alsologtostderr                                                                │ functional-291288 │ jenkins │ v1.37.0 │ 24 Nov 25 09:38 UTC │                     │
	└─────────┴──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/11/24 09:25:43
	Running on machine: ip-172-31-30-239
	Binary: Built with gc go1.25.3 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1124 09:25:43.956868 1707070 out.go:360] Setting OutFile to fd 1 ...
	I1124 09:25:43.957002 1707070 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1124 09:25:43.957006 1707070 out.go:374] Setting ErrFile to fd 2...
	I1124 09:25:43.957010 1707070 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1124 09:25:43.957247 1707070 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21978-1652607/.minikube/bin
	I1124 09:25:43.957575 1707070 out.go:368] Setting JSON to false
	I1124 09:25:43.958421 1707070 start.go:133] hostinfo: {"hostname":"ip-172-31-30-239","uptime":29273,"bootTime":1763947071,"procs":157,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"92f46a7d-c249-4c12-924a-77f64874c910"}
	I1124 09:25:43.958501 1707070 start.go:143] virtualization:  
	I1124 09:25:43.961954 1707070 out.go:179] * [functional-291288] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1124 09:25:43.965745 1707070 out.go:179]   - MINIKUBE_LOCATION=21978
	I1124 09:25:43.965806 1707070 notify.go:221] Checking for updates...
	I1124 09:25:43.971831 1707070 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1124 09:25:43.974596 1707070 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/21978-1652607/kubeconfig
	I1124 09:25:43.977531 1707070 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/21978-1652607/.minikube
	I1124 09:25:43.980447 1707070 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1124 09:25:43.983266 1707070 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1124 09:25:43.986897 1707070 config.go:182] Loaded profile config "functional-291288": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1124 09:25:43.986999 1707070 driver.go:422] Setting default libvirt URI to qemu:///system
	I1124 09:25:44.009686 1707070 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1124 09:25:44.009789 1707070 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1124 09:25:44.075505 1707070 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:55 SystemTime:2025-11-24 09:25:44.065719192 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214831104 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1124 09:25:44.075607 1707070 docker.go:319] overlay module found
	I1124 09:25:44.080493 1707070 out.go:179] * Using the docker driver based on existing profile
	I1124 09:25:44.083298 1707070 start.go:309] selected driver: docker
	I1124 09:25:44.083323 1707070 start.go:927] validating driver "docker" against &{Name:functional-291288 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-291288 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1124 09:25:44.083409 1707070 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1124 09:25:44.083513 1707070 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1124 09:25:44.137525 1707070 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:55 SystemTime:2025-11-24 09:25:44.127840235 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214831104 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1124 09:25:44.137959 1707070 start_flags.go:992] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I1124 09:25:44.137984 1707070 cni.go:84] Creating CNI manager for ""
	I1124 09:25:44.138040 1707070 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1124 09:25:44.138097 1707070 start.go:353] cluster config:
	{Name:functional-291288 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-291288 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1124 09:25:44.143064 1707070 out.go:179] * Starting "functional-291288" primary control-plane node in "functional-291288" cluster
	I1124 09:25:44.145761 1707070 cache.go:134] Beginning downloading kic base image for docker with containerd
	I1124 09:25:44.148578 1707070 out.go:179] * Pulling base image v0.0.48-1763789673-21948 ...
	I1124 09:25:44.151418 1707070 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
	I1124 09:25:44.151496 1707070 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f in local docker daemon
	I1124 09:25:44.171581 1707070 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f in local docker daemon, skipping pull
	I1124 09:25:44.171593 1707070 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f exists in daemon, skipping load
	W1124 09:25:44.210575 1707070 preload.go:144] https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.35.0-beta.0/preloaded-images-k8s-v18-v1.35.0-beta.0-containerd-overlay2-arm64.tar.lz4 status code: 404
	W1124 09:25:44.425167 1707070 preload.go:144] https://github.com/kubernetes-sigs/minikube-preloads/releases/download/v18/preloaded-images-k8s-v18-v1.35.0-beta.0-containerd-overlay2-arm64.tar.lz4 status code: 404
	I1124 09:25:44.425335 1707070 profile.go:143] Saving config to /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/config.json ...
	I1124 09:25:44.425459 1707070 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:25:44.425602 1707070 cache.go:243] Successfully downloaded all kic artifacts
	I1124 09:25:44.425631 1707070 start.go:360] acquireMachinesLock for functional-291288: {Name:mk85384dc057570e1f34db593d357cea738652c4 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:25:44.425681 1707070 start.go:364] duration metric: took 28.381µs to acquireMachinesLock for "functional-291288"
	I1124 09:25:44.425694 1707070 start.go:96] Skipping create...Using existing machine configuration
	I1124 09:25:44.425698 1707070 fix.go:54] fixHost starting: 
	I1124 09:25:44.425962 1707070 cli_runner.go:164] Run: docker container inspect functional-291288 --format={{.State.Status}}
	I1124 09:25:44.443478 1707070 fix.go:112] recreateIfNeeded on functional-291288: state=Running err=<nil>
	W1124 09:25:44.443512 1707070 fix.go:138] unexpected machine state, will restart: <nil>
	I1124 09:25:44.447296 1707070 out.go:252] * Updating the running docker "functional-291288" container ...
	I1124 09:25:44.447326 1707070 machine.go:94] provisionDockerMachine start ...
	I1124 09:25:44.447405 1707070 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:25:44.465953 1707070 main.go:143] libmachine: Using SSH client type: native
	I1124 09:25:44.466284 1707070 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 34684 <nil> <nil>}
	I1124 09:25:44.466291 1707070 main.go:143] libmachine: About to run SSH command:
	hostname
	I1124 09:25:44.603673 1707070 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:25:44.618572 1707070 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-291288
	
	I1124 09:25:44.618586 1707070 ubuntu.go:182] provisioning hostname "functional-291288"
	I1124 09:25:44.618668 1707070 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:25:44.659382 1707070 main.go:143] libmachine: Using SSH client type: native
	I1124 09:25:44.659732 1707070 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 34684 <nil> <nil>}
	I1124 09:25:44.659741 1707070 main.go:143] libmachine: About to run SSH command:
	sudo hostname functional-291288 && echo "functional-291288" | sudo tee /etc/hostname
	I1124 09:25:44.806505 1707070 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:25:44.844189 1707070 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-291288
	
	I1124 09:25:44.844281 1707070 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:25:44.868659 1707070 main.go:143] libmachine: Using SSH client type: native
	I1124 09:25:44.869019 1707070 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 34684 <nil> <nil>}
	I1124 09:25:44.869041 1707070 main.go:143] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sfunctional-291288' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 functional-291288/g' /etc/hosts;
				else 
					echo '127.0.1.1 functional-291288' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1124 09:25:44.979106 1707070 cache.go:107] acquiring lock: {Name:mk22a10f0ce1f3295b61e7e76c455d0494a3e278 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:25:44.979193 1707070 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 exists
	I1124 09:25:44.979201 1707070 cache.go:96] cache image "gcr.io/k8s-minikube/storage-provisioner:v5" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5" took 127.862µs
	I1124 09:25:44.979207 1707070 cache.go:80] save to tar file gcr.io/k8s-minikube/storage-provisioner:v5 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 succeeded
	I1124 09:25:44.979198 1707070 cache.go:107] acquiring lock: {Name:mk80fdbe7cdb5bc17c2a82b4ecfd00214559a435 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:25:44.979218 1707070 cache.go:107] acquiring lock: {Name:mk85f1502dbb97830776608fb729eb3605e112e6 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:25:44.979237 1707070 cache.go:107] acquiring lock: {Name:mk46ce3b59d7e062b3dbc8a90fe5b4231f256471 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:25:44.979267 1707070 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 exists
	I1124 09:25:44.979266 1707070 cache.go:107] acquiring lock: {Name:mk1cf42e67442503a46c578224bd3cb68bf682d4 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:25:44.979273 1707070 cache.go:96] cache image "registry.k8s.io/pause:3.10.1" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1" took 55.992µs
	I1124 09:25:44.979277 1707070 cache.go:80] save to tar file registry.k8s.io/pause:3.10.1 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 succeeded
	I1124 09:25:44.979285 1707070 cache.go:107] acquiring lock: {Name:mk726502cb84c177b2e14fee88512325761511c5 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:25:44.979301 1707070 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 exists
	I1124 09:25:44.979310 1707070 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 exists
	I1124 09:25:44.979308 1707070 cache.go:96] cache image "registry.k8s.io/kube-apiserver:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0" took 43.274µs
	I1124 09:25:44.979314 1707070 cache.go:96] cache image "registry.k8s.io/coredns/coredns:v1.13.1" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1" took 29.982µs
	I1124 09:25:44.979319 1707070 cache.go:80] save to tar file registry.k8s.io/coredns/coredns:v1.13.1 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 succeeded
	I1124 09:25:44.979319 1707070 cache.go:80] save to tar file registry.k8s.io/kube-apiserver:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 succeeded
	I1124 09:25:44.979326 1707070 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.5.24-0 exists
	I1124 09:25:44.979330 1707070 cache.go:96] cache image "registry.k8s.io/etcd:3.5.24-0" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.5.24-0" took 94.392µs
	I1124 09:25:44.979336 1707070 cache.go:80] save to tar file registry.k8s.io/etcd:3.5.24-0 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.5.24-0 succeeded
	I1124 09:25:44.979330 1707070 cache.go:107] acquiring lock: {Name:mkfdc49c8e68aee34cee0c9d441ae8a4dca675c6 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:25:44.979345 1707070 cache.go:107] acquiring lock: {Name:mkdbf38e05e2c47c1a7a906a2236e9e7020a94c5 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 09:25:44.979364 1707070 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 exists
	I1124 09:25:44.979370 1707070 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 exists
	I1124 09:25:44.979368 1707070 cache.go:96] cache image "registry.k8s.io/kube-controller-manager:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0" took 40.427µs
	I1124 09:25:44.979373 1707070 cache.go:96] cache image "registry.k8s.io/kube-scheduler:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0" took 29.49µs
	I1124 09:25:44.979375 1707070 cache.go:80] save to tar file registry.k8s.io/kube-controller-manager:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 succeeded
	I1124 09:25:44.979378 1707070 cache.go:80] save to tar file registry.k8s.io/kube-scheduler:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 succeeded
	I1124 09:25:44.979407 1707070 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 exists
	I1124 09:25:44.979413 1707070 cache.go:96] cache image "registry.k8s.io/kube-proxy:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0" took 225.709µs
	I1124 09:25:44.979418 1707070 cache.go:80] save to tar file registry.k8s.io/kube-proxy:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 succeeded
	I1124 09:25:44.979424 1707070 cache.go:87] Successfully saved all images to host disk.
	I1124 09:25:45.028668 1707070 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	I1124 09:25:45.028686 1707070 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/21978-1652607/.minikube CaCertPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/21978-1652607/.minikube}
	I1124 09:25:45.028706 1707070 ubuntu.go:190] setting up certificates
	I1124 09:25:45.028727 1707070 provision.go:84] configureAuth start
	I1124 09:25:45.028800 1707070 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-291288
	I1124 09:25:45.083635 1707070 provision.go:143] copyHostCerts
	I1124 09:25:45.083709 1707070 exec_runner.go:144] found /home/jenkins/minikube-integration/21978-1652607/.minikube/key.pem, removing ...
	I1124 09:25:45.083718 1707070 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21978-1652607/.minikube/key.pem
	I1124 09:25:45.083806 1707070 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/21978-1652607/.minikube/key.pem (1679 bytes)
	I1124 09:25:45.083920 1707070 exec_runner.go:144] found /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.pem, removing ...
	I1124 09:25:45.083924 1707070 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.pem
	I1124 09:25:45.083951 1707070 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.pem (1078 bytes)
	I1124 09:25:45.084006 1707070 exec_runner.go:144] found /home/jenkins/minikube-integration/21978-1652607/.minikube/cert.pem, removing ...
	I1124 09:25:45.084009 1707070 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21978-1652607/.minikube/cert.pem
	I1124 09:25:45.084038 1707070 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/21978-1652607/.minikube/cert.pem (1123 bytes)
	I1124 09:25:45.084083 1707070 provision.go:117] generating server cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca-key.pem org=jenkins.functional-291288 san=[127.0.0.1 192.168.49.2 functional-291288 localhost minikube]
	I1124 09:25:45.498574 1707070 provision.go:177] copyRemoteCerts
	I1124 09:25:45.498637 1707070 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1124 09:25:45.498677 1707070 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:25:45.520187 1707070 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34684 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-291288/id_rsa Username:docker}
	I1124 09:25:45.626724 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I1124 09:25:45.644660 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server.pem --> /etc/docker/server.pem (1220 bytes)
	I1124 09:25:45.663269 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I1124 09:25:45.681392 1707070 provision.go:87] duration metric: took 652.643227ms to configureAuth
	I1124 09:25:45.681410 1707070 ubuntu.go:206] setting minikube options for container-runtime
	I1124 09:25:45.681611 1707070 config.go:182] Loaded profile config "functional-291288": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1124 09:25:45.681617 1707070 machine.go:97] duration metric: took 1.234286229s to provisionDockerMachine
	I1124 09:25:45.681624 1707070 start.go:293] postStartSetup for "functional-291288" (driver="docker")
	I1124 09:25:45.681634 1707070 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1124 09:25:45.681687 1707070 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1124 09:25:45.681727 1707070 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:25:45.698790 1707070 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34684 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-291288/id_rsa Username:docker}
	I1124 09:25:45.802503 1707070 ssh_runner.go:195] Run: cat /etc/os-release
	I1124 09:25:45.805922 1707070 main.go:143] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1124 09:25:45.805944 1707070 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1124 09:25:45.805954 1707070 filesync.go:126] Scanning /home/jenkins/minikube-integration/21978-1652607/.minikube/addons for local assets ...
	I1124 09:25:45.806011 1707070 filesync.go:126] Scanning /home/jenkins/minikube-integration/21978-1652607/.minikube/files for local assets ...
	I1124 09:25:45.806087 1707070 filesync.go:149] local asset: /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/ssl/certs/16544672.pem -> 16544672.pem in /etc/ssl/certs
	I1124 09:25:45.806167 1707070 filesync.go:149] local asset: /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/test/nested/copy/1654467/hosts -> hosts in /etc/test/nested/copy/1654467
	I1124 09:25:45.806257 1707070 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs /etc/test/nested/copy/1654467
	I1124 09:25:45.814093 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/ssl/certs/16544672.pem --> /etc/ssl/certs/16544672.pem (1708 bytes)
	I1124 09:25:45.832308 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/test/nested/copy/1654467/hosts --> /etc/test/nested/copy/1654467/hosts (40 bytes)
	I1124 09:25:45.850625 1707070 start.go:296] duration metric: took 168.9873ms for postStartSetup
	I1124 09:25:45.850696 1707070 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1124 09:25:45.850734 1707070 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:25:45.868479 1707070 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34684 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-291288/id_rsa Username:docker}
	I1124 09:25:45.971382 1707070 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
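(Annotation: the two `df` probes above feed the post-start disk report. They can be run standalone; a minimal sketch, assuming a Linux host with GNU coreutils.)

```shell
# percentage used on the filesystem backing /var (5th column of `df -h`)
used_pct=$(df -h /var | awk 'NR==2{print $5}')
# free space in whole gigabytes (4th column of `df -BG`)
free_gb=$(df -BG /var | awk 'NR==2{print $4}')
echo "used=$used_pct free=$free_gb"
```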
	I1124 09:25:45.976655 1707070 fix.go:56] duration metric: took 1.550948262s for fixHost
	I1124 09:25:45.976671 1707070 start.go:83] releasing machines lock for "functional-291288", held for 1.550982815s
	I1124 09:25:45.976739 1707070 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-291288
	I1124 09:25:45.997505 1707070 ssh_runner.go:195] Run: cat /version.json
	I1124 09:25:45.997527 1707070 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1124 09:25:45.997550 1707070 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:25:45.997588 1707070 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
	I1124 09:25:46.017321 1707070 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34684 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-291288/id_rsa Username:docker}
	I1124 09:25:46.018732 1707070 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34684 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-291288/id_rsa Username:docker}
	I1124 09:25:46.118131 1707070 ssh_runner.go:195] Run: systemctl --version
	I1124 09:25:46.213854 1707070 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W1124 09:25:46.218087 1707070 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1124 09:25:46.218149 1707070 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1124 09:25:46.225944 1707070 cni.go:259] no active bridge cni configs found in "/etc/cni/net.d" - nothing to disable
	I1124 09:25:46.225958 1707070 start.go:496] detecting cgroup driver to use...
	I1124 09:25:46.225989 1707070 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1124 09:25:46.226035 1707070 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I1124 09:25:46.241323 1707070 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I1124 09:25:46.254720 1707070 docker.go:218] disabling cri-docker service (if available) ...
	I1124 09:25:46.254789 1707070 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1124 09:25:46.270340 1707070 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1124 09:25:46.283549 1707070 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1124 09:25:46.399926 1707070 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1124 09:25:46.515234 1707070 docker.go:234] disabling docker service ...
	I1124 09:25:46.515290 1707070 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1124 09:25:46.529899 1707070 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1124 09:25:46.543047 1707070 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1124 09:25:46.658532 1707070 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1124 09:25:46.775880 1707070 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1124 09:25:46.790551 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I1124 09:25:46.806411 1707070 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:25:46.967053 1707070 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10.1"|' /etc/containerd/config.toml"
	I1124 09:25:46.977583 1707070 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I1124 09:25:46.986552 1707070 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I1124 09:25:46.986618 1707070 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I1124 09:25:46.995635 1707070 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1124 09:25:47.005680 1707070 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I1124 09:25:47.015425 1707070 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1124 09:25:47.024808 1707070 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1124 09:25:47.033022 1707070 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I1124 09:25:47.041980 1707070 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I1124 09:25:47.051362 1707070 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
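(Annotation: the chain of `sed` edits above rewrites containerd's config in place to match the detected "cgroupfs" driver. The SystemdCgroup edit can be exercised on a stand-in file; the TOML contents are abbreviated and illustrative.)

```shell
# force SystemdCgroup = false while preserving indentation, as the logged sed does
toml=$(mktemp)
printf '    [plugins."io.containerd.grpc.v1.cri".containerd.runtimes.runc.options]\n      SystemdCgroup = true\n' > "$toml"
sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' "$toml"
grep 'SystemdCgroup' "$toml"
```

The capture group `( *)` is what keeps the original indentation intact, so the edit is safe regardless of how deeply the key is nested.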
	I1124 09:25:47.060469 1707070 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1124 09:25:47.068004 1707070 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1124 09:25:47.075326 1707070 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1124 09:25:47.191217 1707070 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I1124 09:25:47.313892 1707070 start.go:543] Will wait 60s for socket path /run/containerd/containerd.sock
	I1124 09:25:47.313955 1707070 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I1124 09:25:47.318001 1707070 start.go:564] Will wait 60s for crictl version
	I1124 09:25:47.318060 1707070 ssh_runner.go:195] Run: which crictl
	I1124 09:25:47.321766 1707070 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1124 09:25:47.347974 1707070 start.go:580] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v2.1.5
	RuntimeApiVersion:  v1
	I1124 09:25:47.348042 1707070 ssh_runner.go:195] Run: containerd --version
	I1124 09:25:47.369074 1707070 ssh_runner.go:195] Run: containerd --version
	I1124 09:25:47.394675 1707070 out.go:179] * Preparing Kubernetes v1.35.0-beta.0 on containerd 2.1.5 ...
	I1124 09:25:47.397593 1707070 cli_runner.go:164] Run: docker network inspect functional-291288 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1124 09:25:47.412872 1707070 ssh_runner.go:195] Run: grep 192.168.49.1	host.minikube.internal$ /etc/hosts
	I1124 09:25:47.419437 1707070 out.go:179]   - apiserver.enable-admission-plugins=NamespaceAutoProvision
	I1124 09:25:47.422135 1707070 kubeadm.go:884] updating cluster {Name:functional-291288 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-291288 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1124 09:25:47.422352 1707070 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:25:47.578507 1707070 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:25:47.745390 1707070 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 09:25:47.894887 1707070 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
	I1124 09:25:47.894982 1707070 ssh_runner.go:195] Run: sudo crictl images --output json
	I1124 09:25:47.919585 1707070 containerd.go:627] all images are preloaded for containerd runtime.
	I1124 09:25:47.919604 1707070 cache_images.go:86] Images are preloaded, skipping loading
	I1124 09:25:47.919612 1707070 kubeadm.go:935] updating node { 192.168.49.2 8441 v1.35.0-beta.0 containerd true true} ...
	I1124 09:25:47.919707 1707070 kubeadm.go:947] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.35.0-beta.0/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=functional-291288 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.49.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-291288 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I1124 09:25:47.919778 1707070 ssh_runner.go:195] Run: sudo crictl info
	I1124 09:25:47.948265 1707070 extraconfig.go:125] Overwriting default enable-admission-plugins=NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota with user provided enable-admission-plugins=NamespaceAutoProvision for component apiserver
	I1124 09:25:47.948285 1707070 cni.go:84] Creating CNI manager for ""
	I1124 09:25:47.948293 1707070 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1124 09:25:47.948308 1707070 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1124 09:25:47.948331 1707070 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.49.2 APIServerPort:8441 KubernetesVersion:v1.35.0-beta.0 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:functional-291288 NodeName:functional-291288 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceAutoProvision] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.49.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.49.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1124 09:25:47.948441 1707070 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.49.2
	  bindPort: 8441
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "functional-291288"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.49.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceAutoProvision"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8441
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.35.0-beta.0
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
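(Annotation: the rendered kubeadm.yaml above is a four-document YAML stream, InitConfiguration, ClusterConfiguration, KubeletConfiguration, and KubeProxyConfiguration, separated by `---`. A quick structural check on a stand-in file:)

```shell
# each kubeadm document announces itself with a top-level `kind:` line
f=$(mktemp)
printf 'kind: InitConfiguration\n---\nkind: ClusterConfiguration\n---\nkind: KubeletConfiguration\n---\nkind: KubeProxyConfiguration\n' > "$f"
kinds=$(grep -c '^kind:' "$f")
echo "$kinds documents"
```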
	I1124 09:25:47.948507 1707070 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.35.0-beta.0
	I1124 09:25:47.956183 1707070 binaries.go:51] Found k8s binaries, skipping transfer
	I1124 09:25:47.956246 1707070 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1124 09:25:47.963641 1707070 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (328 bytes)
	I1124 09:25:47.976586 1707070 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (359 bytes)
	I1124 09:25:47.989056 1707070 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2087 bytes)
	I1124 09:25:48.003961 1707070 ssh_runner.go:195] Run: grep 192.168.49.2	control-plane.minikube.internal$ /etc/hosts
	I1124 09:25:48.011533 1707070 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1124 09:25:48.134407 1707070 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1124 09:25:48.383061 1707070 certs.go:69] Setting up /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288 for IP: 192.168.49.2
	I1124 09:25:48.383072 1707070 certs.go:195] generating shared ca certs ...
	I1124 09:25:48.383086 1707070 certs.go:227] acquiring lock for ca certs: {Name:mkbe540a30c4376a351176f7fe6fec044d058b09 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1124 09:25:48.383238 1707070 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.key
	I1124 09:25:48.383279 1707070 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/proxy-client-ca.key
	I1124 09:25:48.383286 1707070 certs.go:257] generating profile certs ...
	I1124 09:25:48.383366 1707070 certs.go:360] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/client.key
	I1124 09:25:48.383420 1707070 certs.go:360] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/apiserver.key.5acb2515
	I1124 09:25:48.383456 1707070 certs.go:360] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/proxy-client.key
	I1124 09:25:48.383562 1707070 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/1654467.pem (1338 bytes)
	W1124 09:25:48.383598 1707070 certs.go:480] ignoring /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/1654467_empty.pem, impossibly tiny 0 bytes
	I1124 09:25:48.383605 1707070 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca-key.pem (1671 bytes)
	I1124 09:25:48.383632 1707070 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem (1078 bytes)
	I1124 09:25:48.383655 1707070 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/cert.pem (1123 bytes)
	I1124 09:25:48.383684 1707070 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/key.pem (1679 bytes)
	I1124 09:25:48.383730 1707070 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/ssl/certs/16544672.pem (1708 bytes)
	I1124 09:25:48.384294 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1124 09:25:48.403533 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I1124 09:25:48.421212 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1124 09:25:48.441887 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I1124 09:25:48.462311 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I1124 09:25:48.480889 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I1124 09:25:48.499086 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1124 09:25:48.517112 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I1124 09:25:48.535554 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/ssl/certs/16544672.pem --> /usr/share/ca-certificates/16544672.pem (1708 bytes)
	I1124 09:25:48.553310 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1124 09:25:48.571447 1707070 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/1654467.pem --> /usr/share/ca-certificates/1654467.pem (1338 bytes)
	I1124 09:25:48.589094 1707070 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I1124 09:25:48.602393 1707070 ssh_runner.go:195] Run: openssl version
	I1124 09:25:48.608953 1707070 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/16544672.pem && ln -fs /usr/share/ca-certificates/16544672.pem /etc/ssl/certs/16544672.pem"
	I1124 09:25:48.617886 1707070 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/16544672.pem
	I1124 09:25:48.621697 1707070 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Nov 24 09:10 /usr/share/ca-certificates/16544672.pem
	I1124 09:25:48.621756 1707070 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/16544672.pem
	I1124 09:25:48.663214 1707070 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/16544672.pem /etc/ssl/certs/3ec20f2e.0"
	I1124 09:25:48.671328 1707070 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I1124 09:25:48.679977 1707070 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1124 09:25:48.683961 1707070 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Nov 24 08:44 /usr/share/ca-certificates/minikubeCA.pem
	I1124 09:25:48.684024 1707070 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1124 09:25:48.725273 1707070 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I1124 09:25:48.733278 1707070 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1654467.pem && ln -fs /usr/share/ca-certificates/1654467.pem /etc/ssl/certs/1654467.pem"
	I1124 09:25:48.741887 1707070 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1654467.pem
	I1124 09:25:48.745440 1707070 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Nov 24 09:10 /usr/share/ca-certificates/1654467.pem
	I1124 09:25:48.745500 1707070 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1654467.pem
	I1124 09:25:48.791338 1707070 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1654467.pem /etc/ssl/certs/51391683.0"
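(Annotation: the `openssl x509 -hash` calls above compute the subject hash that names the `/etc/ssl/certs/<hash>.0` symlinks, e.g. `b5213941.0` for minikubeCA. A sketch with a throwaway self-signed certificate:)

```shell
# OpenSSL locates trusted CAs by an 8-hex-digit subject hash, so each
# cert is linked to <hash>.0 in the certs directory
cert=$(mktemp)
openssl req -x509 -newkey rsa:2048 -nodes -keyout /dev/null \
  -subj '/CN=demo' -days 1 -out "$cert" 2>/dev/null
hash=$(openssl x509 -hash -noout -in "$cert")
echo "$hash"
```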
	I1124 09:25:48.799503 1707070 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1124 09:25:48.803145 1707070 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I1124 09:25:48.844016 1707070 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I1124 09:25:48.884962 1707070 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I1124 09:25:48.926044 1707070 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I1124 09:25:48.967289 1707070 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I1124 09:25:49.008697 1707070 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
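(Annotation: the six `-checkend 86400` probes above exit 0 only if each certificate remains valid for at least another 24 hours, which is how the restart path decides whether regeneration is needed. Demonstrated on a fresh self-signed cert:)

```shell
# exit status 0 = cert still valid 86400s (24h) from now
c=$(mktemp); k=$(mktemp)
openssl req -x509 -newkey rsa:2048 -nodes -keyout "$k" \
  -subj '/CN=demo' -days 30 -out "$c" 2>/dev/null
openssl x509 -noout -in "$c" -checkend 86400 && echo "valid for 24h+"
```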
	I1124 09:25:49.049934 1707070 kubeadm.go:401] StartCluster: {Name:functional-291288 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-291288 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1124 09:25:49.050012 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I1124 09:25:49.050074 1707070 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1124 09:25:49.080420 1707070 cri.go:89] found id: ""
	I1124 09:25:49.080484 1707070 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1124 09:25:49.088364 1707070 kubeadm.go:417] found existing configuration files, will attempt cluster restart
	I1124 09:25:49.088374 1707070 kubeadm.go:598] restartPrimaryControlPlane start ...
	I1124 09:25:49.088425 1707070 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I1124 09:25:49.095680 1707070 kubeadm.go:131] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I1124 09:25:49.096194 1707070 kubeconfig.go:125] found "functional-291288" server: "https://192.168.49.2:8441"
	I1124 09:25:49.097500 1707070 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I1124 09:25:49.105267 1707070 kubeadm.go:645] detected kubeadm config drift (will reconfigure cluster from new /var/tmp/minikube/kubeadm.yaml):
	-- stdout --
	--- /var/tmp/minikube/kubeadm.yaml	2025-11-24 09:11:10.138797725 +0000
	+++ /var/tmp/minikube/kubeadm.yaml.new	2025-11-24 09:25:47.995648074 +0000
	@@ -24,7 +24,7 @@
	   certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	   extraArgs:
	     - name: "enable-admission-plugins"
	-      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	+      value: "NamespaceAutoProvision"
	 controllerManager:
	   extraArgs:
	     - name: "allocate-node-cidrs"
	
	-- /stdout --
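(Annotation: the drift check above relies on `diff -u` exiting non-zero when the freshly rendered kubeadm.yaml.new differs from the kubeadm.yaml on disk. A minimal reproduction with stand-in files:)

```shell
# diff exits 1 (and prints a unified diff) when the files differ;
# minikube treats that as "reconfigure the cluster from the new file"
old=$(mktemp); new=$(mktemp)
echo 'value: "NamespaceLifecycle"' > "$old"
echo 'value: "NamespaceAutoProvision"' > "$new"
diff -u "$old" "$new" || echo "config drift detected"
```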
	I1124 09:25:49.105285 1707070 kubeadm.go:1161] stopping kube-system containers ...
	I1124 09:25:49.105296 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name: Namespaces:[kube-system]}
	I1124 09:25:49.105351 1707070 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1124 09:25:49.142256 1707070 cri.go:89] found id: ""
	I1124 09:25:49.142317 1707070 ssh_runner.go:195] Run: sudo systemctl stop kubelet
	I1124 09:25:49.162851 1707070 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1124 09:25:49.170804 1707070 kubeadm.go:158] found existing configuration files:
	-rw------- 1 root root 5635 Nov 24 09:15 /etc/kubernetes/admin.conf
	-rw------- 1 root root 5636 Nov 24 09:15 /etc/kubernetes/controller-manager.conf
	-rw------- 1 root root 5672 Nov 24 09:15 /etc/kubernetes/kubelet.conf
	-rw------- 1 root root 5584 Nov 24 09:15 /etc/kubernetes/scheduler.conf
	
	I1124 09:25:49.170876 1707070 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1124 09:25:49.178603 1707070 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1124 09:25:49.185907 1707070 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1124 09:25:49.185964 1707070 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1124 09:25:49.193453 1707070 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1124 09:25:49.200815 1707070 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1124 09:25:49.200869 1707070 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1124 09:25:49.208328 1707070 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1124 09:25:49.215968 1707070 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1124 09:25:49.216025 1707070 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
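The grep-then-rm pairs above check each kubeconfig for the expected control-plane endpoint and delete any file that does not reference it, so the following kubeadm phases regenerate them. A minimal sketch of that logic (the endpoint and file names are taken from the log; the function wrapper is an illustrative assumption, not minikube's actual code):

```shell
# prune_stale_kubeconfigs DIR ENDPOINT
# Delete each kubeconfig under DIR that does not mention ENDPOINT,
# mirroring the "grep ... || rm -f ..." pairs in the log above.
prune_stale_kubeconfigs() {
  dir=$1
  endpoint=$2
  for f in kubelet.conf controller-manager.conf scheduler.conf; do
    path="$dir/$f"
    # -F: fixed-string match; -s: stay quiet if the file is missing
    if ! grep -qsF "$endpoint" "$path"; then
      rm -f "$path"   # stale or absent: kubeadm will regenerate it
    fi
  done
}
```

In the log, all three grep calls exit with status 1, so all three files are removed before `kubeadm init phase kubeconfig all` runs.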
	I1124 09:25:49.223400 1707070 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1124 09:25:49.230953 1707070 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase certs all --config /var/tmp/minikube/kubeadm.yaml"
	I1124 09:25:49.277779 1707070 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase kubeconfig all --config /var/tmp/minikube/kubeadm.yaml"
	I1124 09:25:50.308934 1707070 ssh_runner.go:235] Completed: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase kubeconfig all --config /var/tmp/minikube/kubeadm.yaml": (1.031131442s)
	I1124 09:25:50.308993 1707070 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase kubelet-start --config /var/tmp/minikube/kubeadm.yaml"
	I1124 09:25:50.511648 1707070 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase control-plane all --config /var/tmp/minikube/kubeadm.yaml"
	I1124 09:25:50.576653 1707070 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase etcd local --config /var/tmp/minikube/kubeadm.yaml"
	I1124 09:25:50.625775 1707070 api_server.go:52] waiting for apiserver process to appear ...
	I1124 09:25:50.625855 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:51.126713 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:51.625939 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:52.126677 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:52.626053 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:53.126113 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:53.626972 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:54.126493 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:54.626036 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:55.126171 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:55.626853 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:56.126041 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:56.626177 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:57.126019 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:57.626847 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:58.126017 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:58.626716 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:59.125997 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:25:59.626367 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:00.125951 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:00.626013 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:01.126844 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:01.626038 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:02.126420 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:02.626727 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:03.126582 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:03.626068 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:04.126304 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:04.626830 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:05.126754 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:05.625961 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:06.126197 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:06.626039 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:07.126915 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:07.626052 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:08.126281 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:08.626116 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:09.126574 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:09.626068 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:10.125978 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:10.626328 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:11.126416 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:11.626073 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:12.126027 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:12.626174 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:13.126044 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:13.626781 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:14.126849 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:14.626203 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:15.125957 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:15.626068 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:16.126934 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:16.626382 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:17.126245 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:17.626034 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:18.126745 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:18.626942 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:19.126393 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:19.626607 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:20.126050 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:20.626732 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:21.126049 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:21.626115 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:22.125988 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:22.626261 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:23.126293 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:23.626107 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:24.126971 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:24.626009 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:25.126859 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:25.626876 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:26.126041 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:26.625983 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:27.126168 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:27.626079 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:28.126047 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:28.626761 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:29.126598 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:29.626290 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:30.125941 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:30.626102 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:31.126717 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:31.626588 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:32.126223 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:32.626875 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:33.126051 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:33.625963 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:34.126808 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:34.626621 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:35.126147 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:35.626018 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:36.126039 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:36.625970 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:37.126579 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:37.626198 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:38.126718 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:38.626386 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:39.126159 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:39.626590 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:40.126050 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:40.626422 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:41.126600 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:41.626097 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:42.127732 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:42.626108 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:43.126855 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:43.626202 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:44.126380 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:44.626423 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:45.127019 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:45.626257 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:46.125911 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:46.626125 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:47.126026 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:47.626915 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:48.126322 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:48.626706 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:49.126864 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:49.627009 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:50.126375 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
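The block above is a wait loop: the same `pgrep` probe fired every half second from 09:25:50 to 09:26:50 without ever finding a kube-apiserver process. A sketch of that poll-with-deadline pattern (the pgrep arguments are copied from the log; the explicit timeout parameter is an assumption for illustration):

```shell
# wait_for_apiserver TIMEOUT_SECS
# Poll every 0.5s for a running kube-apiserver process, as the
# repeated pgrep calls in the log do; return 1 once the deadline
# passes without a match.
wait_for_apiserver() {
  deadline=$(( $(date +%s) + $1 ))
  while [ "$(date +%s)" -lt "$deadline" ]; do
    # -x -f: match the pattern against the full command line; -n: newest
    if pgrep -xnf 'kube-apiserver.*minikube.*' >/dev/null 2>&1; then
      return 0
    fi
    sleep 0.5
  done
  return 1
}
```

Here the loop exhausts its budget, so minikube falls through to the container-listing and log-gathering diagnostics that follow.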
	I1124 09:26:50.626418 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:26:50.626521 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:26:50.654529 1707070 cri.go:89] found id: ""
	I1124 09:26:50.654543 1707070 logs.go:282] 0 containers: []
	W1124 09:26:50.654550 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:26:50.654555 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:26:50.654624 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:26:50.683038 1707070 cri.go:89] found id: ""
	I1124 09:26:50.683052 1707070 logs.go:282] 0 containers: []
	W1124 09:26:50.683059 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:26:50.683064 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:26:50.683121 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:26:50.711396 1707070 cri.go:89] found id: ""
	I1124 09:26:50.711410 1707070 logs.go:282] 0 containers: []
	W1124 09:26:50.711422 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:26:50.711433 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:26:50.711498 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:26:50.735435 1707070 cri.go:89] found id: ""
	I1124 09:26:50.735449 1707070 logs.go:282] 0 containers: []
	W1124 09:26:50.735457 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:26:50.735463 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:26:50.735520 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:26:50.760437 1707070 cri.go:89] found id: ""
	I1124 09:26:50.760451 1707070 logs.go:282] 0 containers: []
	W1124 09:26:50.760458 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:26:50.760464 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:26:50.760520 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:26:50.785555 1707070 cri.go:89] found id: ""
	I1124 09:26:50.785576 1707070 logs.go:282] 0 containers: []
	W1124 09:26:50.785584 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:26:50.785590 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:26:50.785662 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:26:50.810261 1707070 cri.go:89] found id: ""
	I1124 09:26:50.810278 1707070 logs.go:282] 0 containers: []
	W1124 09:26:50.810286 1707070 logs.go:284] No container was found matching "kindnet"
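Each `cri.go` / `crictl` pair above queries one control-plane component by name and finds no containers at all. The sweep can be sketched as below (component names and the `crictl ps -a --quiet --name` invocation come from the log; the function and its pluggable lister argument are illustrative assumptions so the loop can be exercised without a CRI socket):

```shell
# list_k8s_containers LISTER
# Run LISTER once per control-plane component name and print
# "<name>: <ids|none>". With LISTER set to
# "sudo crictl ps -a --quiet --name" this reproduces the
# per-component listing in the log above.
list_k8s_containers() {
  for name in kube-apiserver etcd coredns kube-scheduler \
              kube-proxy kube-controller-manager kindnet; do
    # $1 is intentionally unquoted so a multi-word command works
    ids=$($1 "$name" 2>/dev/null)
    echo "$name: ${ids:-none}"
  done
}
```

In this run every component reports `found id: ""`, i.e. not even an exited container exists, which is consistent with the kubelet never having started the static pods.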
	I1124 09:26:50.810294 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:26:50.810305 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:26:50.879322 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:26:50.870488   11382 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:50.871030   11382 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:50.872890   11382 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:50.873352   11382 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:50.875005   11382 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:26:50.870488   11382 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:50.871030   11382 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:50.872890   11382 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:50.873352   11382 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:50.875005   11382 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
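The repeated `connection refused` errors above mean nothing is accepting TCP connections on localhost:8441, so every kubectl request fails before any TLS or auth step. A quick connectivity probe of that kind can be sketched with bash's `/dev/tcp` redirection (the host and port come from the log; the helper itself is an assumed illustration, not part of minikube):

```shell
# probe_apiserver HOST PORT
# Succeed only if something accepts TCP connections on HOST:PORT.
# The "dial tcp [::1]:8441: connect: connection refused" lines above
# are the same failure this check reports for localhost:8441.
probe_apiserver() {
  # Open and immediately discard a TCP connection via bash's /dev/tcp
  (exec 3<>"/dev/tcp/$1/$2") 2>/dev/null
}
```

A refused probe here, combined with the empty `crictl` listings, points at the apiserver container never starting rather than a networking or kubeconfig problem.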
	I1124 09:26:50.879334 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:26:50.879345 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:26:50.941117 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:26:50.941140 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:26:50.969259 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:26:50.969275 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:26:51.024741 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:26:51.024763 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:26:53.542977 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:53.553083 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:26:53.553155 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:26:53.577781 1707070 cri.go:89] found id: ""
	I1124 09:26:53.577795 1707070 logs.go:282] 0 containers: []
	W1124 09:26:53.577802 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:26:53.577808 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:26:53.577866 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:26:53.604191 1707070 cri.go:89] found id: ""
	I1124 09:26:53.604205 1707070 logs.go:282] 0 containers: []
	W1124 09:26:53.604212 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:26:53.604217 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:26:53.604277 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:26:53.632984 1707070 cri.go:89] found id: ""
	I1124 09:26:53.632998 1707070 logs.go:282] 0 containers: []
	W1124 09:26:53.633004 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:26:53.633010 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:26:53.633071 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:26:53.663828 1707070 cri.go:89] found id: ""
	I1124 09:26:53.663842 1707070 logs.go:282] 0 containers: []
	W1124 09:26:53.663850 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:26:53.663856 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:26:53.663912 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:26:53.695173 1707070 cri.go:89] found id: ""
	I1124 09:26:53.695187 1707070 logs.go:282] 0 containers: []
	W1124 09:26:53.695195 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:26:53.695200 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:26:53.695259 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:26:53.719882 1707070 cri.go:89] found id: ""
	I1124 09:26:53.719897 1707070 logs.go:282] 0 containers: []
	W1124 09:26:53.719904 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:26:53.719910 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:26:53.719993 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:26:53.753006 1707070 cri.go:89] found id: ""
	I1124 09:26:53.753020 1707070 logs.go:282] 0 containers: []
	W1124 09:26:53.753038 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:26:53.753046 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:26:53.753057 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:26:53.810839 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:26:53.810864 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:26:53.828132 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:26:53.828149 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:26:53.893802 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:26:53.885327   11494 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:53.886130   11494 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:53.888016   11494 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:53.888539   11494 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:53.890056   11494 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:26:53.885327   11494 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:53.886130   11494 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:53.888016   11494 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:53.888539   11494 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:53.890056   11494 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:26:53.893815 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:26:53.893825 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:26:53.955840 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:26:53.955860 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:26:56.485625 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:56.495752 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:26:56.495812 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:26:56.523600 1707070 cri.go:89] found id: ""
	I1124 09:26:56.523614 1707070 logs.go:282] 0 containers: []
	W1124 09:26:56.523622 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:26:56.523627 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:26:56.523730 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:26:56.547432 1707070 cri.go:89] found id: ""
	I1124 09:26:56.547445 1707070 logs.go:282] 0 containers: []
	W1124 09:26:56.547453 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:26:56.547465 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:26:56.547522 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:26:56.571895 1707070 cri.go:89] found id: ""
	I1124 09:26:56.571909 1707070 logs.go:282] 0 containers: []
	W1124 09:26:56.571917 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:26:56.571922 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:26:56.571977 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:26:56.596624 1707070 cri.go:89] found id: ""
	I1124 09:26:56.596637 1707070 logs.go:282] 0 containers: []
	W1124 09:26:56.596644 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:26:56.596650 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:26:56.596705 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:26:56.621497 1707070 cri.go:89] found id: ""
	I1124 09:26:56.621511 1707070 logs.go:282] 0 containers: []
	W1124 09:26:56.621518 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:26:56.621523 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:26:56.621588 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:26:56.656808 1707070 cri.go:89] found id: ""
	I1124 09:26:56.656822 1707070 logs.go:282] 0 containers: []
	W1124 09:26:56.656829 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:26:56.656834 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:26:56.656891 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:26:56.693750 1707070 cri.go:89] found id: ""
	I1124 09:26:56.693763 1707070 logs.go:282] 0 containers: []
	W1124 09:26:56.693770 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:26:56.693778 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:26:56.693799 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:26:56.711624 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:26:56.711642 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:26:56.772006 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:26:56.764543   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:56.764946   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:56.766216   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:56.766780   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:56.768382   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:26:56.764543   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:56.764946   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:56.766216   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:56.766780   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:56.768382   11595 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:26:56.772020 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:26:56.772030 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:26:56.832784 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:26:56.832805 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:26:56.862164 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:26:56.862179 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:26:59.417328 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:26:59.427445 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:26:59.427506 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:26:59.451539 1707070 cri.go:89] found id: ""
	I1124 09:26:59.451574 1707070 logs.go:282] 0 containers: []
	W1124 09:26:59.451582 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:26:59.451588 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:26:59.451647 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:26:59.476110 1707070 cri.go:89] found id: ""
	I1124 09:26:59.476124 1707070 logs.go:282] 0 containers: []
	W1124 09:26:59.476131 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:26:59.476137 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:26:59.476194 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:26:59.504520 1707070 cri.go:89] found id: ""
	I1124 09:26:59.504533 1707070 logs.go:282] 0 containers: []
	W1124 09:26:59.504540 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:26:59.504546 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:26:59.504607 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:26:59.529647 1707070 cri.go:89] found id: ""
	I1124 09:26:59.529662 1707070 logs.go:282] 0 containers: []
	W1124 09:26:59.529669 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:26:59.529674 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:26:59.529753 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:26:59.558904 1707070 cri.go:89] found id: ""
	I1124 09:26:59.558918 1707070 logs.go:282] 0 containers: []
	W1124 09:26:59.558925 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:26:59.558930 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:26:59.558999 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:26:59.583698 1707070 cri.go:89] found id: ""
	I1124 09:26:59.583712 1707070 logs.go:282] 0 containers: []
	W1124 09:26:59.583733 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:26:59.583738 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:26:59.583800 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:26:59.607605 1707070 cri.go:89] found id: ""
	I1124 09:26:59.607619 1707070 logs.go:282] 0 containers: []
	W1124 09:26:59.607626 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:26:59.607634 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:26:59.607645 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:26:59.624446 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:26:59.624462 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:26:59.711588 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:26:59.701837   11699 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:59.703242   11699 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:59.704228   11699 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:59.706009   11699 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:59.706513   11699 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:26:59.701837   11699 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:59.703242   11699 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:59.704228   11699 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:59.706009   11699 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:26:59.706513   11699 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:26:59.711600 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:26:59.711610 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:26:59.777617 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:26:59.777638 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:26:59.810868 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:26:59.810888 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:02.368395 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:02.379444 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:02.379503 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:02.403995 1707070 cri.go:89] found id: ""
	I1124 09:27:02.404009 1707070 logs.go:282] 0 containers: []
	W1124 09:27:02.404017 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:02.404022 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:02.404080 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:02.428532 1707070 cri.go:89] found id: ""
	I1124 09:27:02.428546 1707070 logs.go:282] 0 containers: []
	W1124 09:27:02.428553 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:02.428559 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:02.428623 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:02.455148 1707070 cri.go:89] found id: ""
	I1124 09:27:02.455162 1707070 logs.go:282] 0 containers: []
	W1124 09:27:02.455169 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:02.455174 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:02.455233 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:02.479942 1707070 cri.go:89] found id: ""
	I1124 09:27:02.479957 1707070 logs.go:282] 0 containers: []
	W1124 09:27:02.479969 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:02.479975 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:02.480034 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:02.505728 1707070 cri.go:89] found id: ""
	I1124 09:27:02.505744 1707070 logs.go:282] 0 containers: []
	W1124 09:27:02.505751 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:02.505760 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:02.505845 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:02.536863 1707070 cri.go:89] found id: ""
	I1124 09:27:02.536881 1707070 logs.go:282] 0 containers: []
	W1124 09:27:02.536889 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:02.536894 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:02.536960 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:02.566083 1707070 cri.go:89] found id: ""
	I1124 09:27:02.566107 1707070 logs.go:282] 0 containers: []
	W1124 09:27:02.566124 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:02.566132 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:02.566142 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:02.628402 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:02.628423 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:02.669505 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:02.669523 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:02.737879 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:02.737907 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:02.755317 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:02.755334 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:02.820465 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:02.811248   11818 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:02.812608   11818 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:02.813513   11818 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:02.815318   11818 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:02.815727   11818 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:02.811248   11818 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:02.812608   11818 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:02.813513   11818 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:02.815318   11818 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:02.815727   11818 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:05.320749 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:05.331020 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:05.331081 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:05.355889 1707070 cri.go:89] found id: ""
	I1124 09:27:05.355904 1707070 logs.go:282] 0 containers: []
	W1124 09:27:05.355912 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:05.355917 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:05.355980 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:05.381650 1707070 cri.go:89] found id: ""
	I1124 09:27:05.381664 1707070 logs.go:282] 0 containers: []
	W1124 09:27:05.381671 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:05.381676 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:05.381733 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:05.410311 1707070 cri.go:89] found id: ""
	I1124 09:27:05.410325 1707070 logs.go:282] 0 containers: []
	W1124 09:27:05.410332 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:05.410337 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:05.410396 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:05.434601 1707070 cri.go:89] found id: ""
	I1124 09:27:05.434615 1707070 logs.go:282] 0 containers: []
	W1124 09:27:05.434621 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:05.434627 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:05.434684 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:05.459196 1707070 cri.go:89] found id: ""
	I1124 09:27:05.459210 1707070 logs.go:282] 0 containers: []
	W1124 09:27:05.459218 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:05.459223 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:05.459294 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:05.483433 1707070 cri.go:89] found id: ""
	I1124 09:27:05.483448 1707070 logs.go:282] 0 containers: []
	W1124 09:27:05.483455 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:05.483460 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:05.483523 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:05.508072 1707070 cri.go:89] found id: ""
	I1124 09:27:05.508086 1707070 logs.go:282] 0 containers: []
	W1124 09:27:05.508093 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:05.508101 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:05.508111 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:05.563733 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:05.563752 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:05.584705 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:05.584736 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:05.666380 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:05.657873   11904 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:05.658740   11904 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:05.660432   11904 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:05.660828   11904 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:05.662363   11904 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:05.657873   11904 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:05.658740   11904 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:05.660432   11904 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:05.660828   11904 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:05.662363   11904 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:05.666394 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:05.666405 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:05.738526 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:05.738548 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:08.268404 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:08.278347 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:08.278408 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:08.303562 1707070 cri.go:89] found id: ""
	I1124 09:27:08.303577 1707070 logs.go:282] 0 containers: []
	W1124 09:27:08.303585 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:08.303590 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:08.303651 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:08.329886 1707070 cri.go:89] found id: ""
	I1124 09:27:08.329900 1707070 logs.go:282] 0 containers: []
	W1124 09:27:08.329907 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:08.329913 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:08.329971 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:08.355081 1707070 cri.go:89] found id: ""
	I1124 09:27:08.355096 1707070 logs.go:282] 0 containers: []
	W1124 09:27:08.355104 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:08.355110 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:08.355175 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:08.381511 1707070 cri.go:89] found id: ""
	I1124 09:27:08.381534 1707070 logs.go:282] 0 containers: []
	W1124 09:27:08.381543 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:08.381549 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:08.381620 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:08.410606 1707070 cri.go:89] found id: ""
	I1124 09:27:08.410629 1707070 logs.go:282] 0 containers: []
	W1124 09:27:08.410637 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:08.410642 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:08.410700 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:08.434980 1707070 cri.go:89] found id: ""
	I1124 09:27:08.434994 1707070 logs.go:282] 0 containers: []
	W1124 09:27:08.435001 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:08.435007 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:08.435064 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:08.463780 1707070 cri.go:89] found id: ""
	I1124 09:27:08.463793 1707070 logs.go:282] 0 containers: []
	W1124 09:27:08.463800 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:08.463808 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:08.463819 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:08.527201 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:08.518614   12003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:08.519320   12003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:08.521220   12003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:08.521832   12003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:08.523649   12003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:08.518614   12003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:08.519320   12003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:08.521220   12003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:08.521832   12003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:08.523649   12003 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:08.527213 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:08.527223 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:08.591559 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:08.591581 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:08.619107 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:08.619125 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:08.678658 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:08.678675 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:11.199028 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:11.209463 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:11.209529 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:11.236040 1707070 cri.go:89] found id: ""
	I1124 09:27:11.236061 1707070 logs.go:282] 0 containers: []
	W1124 09:27:11.236069 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:11.236075 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:11.236145 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:11.263895 1707070 cri.go:89] found id: ""
	I1124 09:27:11.263906 1707070 logs.go:282] 0 containers: []
	W1124 09:27:11.263912 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:11.263917 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:11.263968 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:11.290492 1707070 cri.go:89] found id: ""
	I1124 09:27:11.290507 1707070 logs.go:282] 0 containers: []
	W1124 09:27:11.290514 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:11.290519 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:11.290575 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:11.316763 1707070 cri.go:89] found id: ""
	I1124 09:27:11.316778 1707070 logs.go:282] 0 containers: []
	W1124 09:27:11.316785 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:11.316791 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:11.316899 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:11.340653 1707070 cri.go:89] found id: ""
	I1124 09:27:11.340668 1707070 logs.go:282] 0 containers: []
	W1124 09:27:11.340675 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:11.340680 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:11.340741 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:11.365000 1707070 cri.go:89] found id: ""
	I1124 09:27:11.365013 1707070 logs.go:282] 0 containers: []
	W1124 09:27:11.365020 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:11.365026 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:11.365086 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:11.393012 1707070 cri.go:89] found id: ""
	I1124 09:27:11.393025 1707070 logs.go:282] 0 containers: []
	W1124 09:27:11.393033 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:11.393041 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:11.393053 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:11.409740 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:11.409758 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:11.474068 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:11.465242   12112 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:11.466095   12112 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:11.467959   12112 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:11.468588   12112 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:11.470448   12112 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1124 09:27:11.474079 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:11.474089 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:11.535411 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:11.535433 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:11.565626 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:11.565645 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:14.123823 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:14.133770 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:14.133829 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:14.157476 1707070 cri.go:89] found id: ""
	I1124 09:27:14.157490 1707070 logs.go:282] 0 containers: []
	W1124 09:27:14.157497 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:14.157503 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:14.157562 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:14.188747 1707070 cri.go:89] found id: ""
	I1124 09:27:14.188761 1707070 logs.go:282] 0 containers: []
	W1124 09:27:14.188768 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:14.188773 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:14.188830 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:14.216257 1707070 cri.go:89] found id: ""
	I1124 09:27:14.216271 1707070 logs.go:282] 0 containers: []
	W1124 09:27:14.216279 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:14.216284 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:14.216345 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:14.241336 1707070 cri.go:89] found id: ""
	I1124 09:27:14.241349 1707070 logs.go:282] 0 containers: []
	W1124 09:27:14.241357 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:14.241362 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:14.241423 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:14.265223 1707070 cri.go:89] found id: ""
	I1124 09:27:14.265238 1707070 logs.go:282] 0 containers: []
	W1124 09:27:14.265245 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:14.265250 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:14.265312 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:14.292087 1707070 cri.go:89] found id: ""
	I1124 09:27:14.292101 1707070 logs.go:282] 0 containers: []
	W1124 09:27:14.292108 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:14.292114 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:14.292171 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:14.316839 1707070 cri.go:89] found id: ""
	I1124 09:27:14.316854 1707070 logs.go:282] 0 containers: []
	W1124 09:27:14.316861 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:14.316869 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:14.316879 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:14.371692 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:14.371715 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:14.388964 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:14.388980 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:14.455069 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:14.447375   12220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:14.448018   12220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:14.449517   12220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:14.449819   12220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:14.451683   12220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1124 09:27:14.455080 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:14.455090 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:14.518102 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:14.518124 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:17.045537 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:17.055937 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:17.056004 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:17.084357 1707070 cri.go:89] found id: ""
	I1124 09:27:17.084370 1707070 logs.go:282] 0 containers: []
	W1124 09:27:17.084378 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:17.084383 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:17.084439 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:17.112022 1707070 cri.go:89] found id: ""
	I1124 09:27:17.112035 1707070 logs.go:282] 0 containers: []
	W1124 09:27:17.112043 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:17.112048 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:17.112110 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:17.135317 1707070 cri.go:89] found id: ""
	I1124 09:27:17.135331 1707070 logs.go:282] 0 containers: []
	W1124 09:27:17.135338 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:17.135343 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:17.135399 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:17.163850 1707070 cri.go:89] found id: ""
	I1124 09:27:17.163865 1707070 logs.go:282] 0 containers: []
	W1124 09:27:17.163872 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:17.163878 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:17.163933 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:17.188915 1707070 cri.go:89] found id: ""
	I1124 09:27:17.188929 1707070 logs.go:282] 0 containers: []
	W1124 09:27:17.188936 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:17.188941 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:17.188997 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:17.217448 1707070 cri.go:89] found id: ""
	I1124 09:27:17.217461 1707070 logs.go:282] 0 containers: []
	W1124 09:27:17.217475 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:17.217480 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:17.217537 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:17.242521 1707070 cri.go:89] found id: ""
	I1124 09:27:17.242536 1707070 logs.go:282] 0 containers: []
	W1124 09:27:17.242543 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:17.242551 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:17.242561 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:17.297899 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:17.297921 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:17.315278 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:17.315297 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:17.377620 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:17.368489   12324 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:17.368893   12324 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:17.370596   12324 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:17.371050   12324 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:17.372486   12324 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1124 09:27:17.377640 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:17.377651 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:17.439884 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:17.439907 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:19.969337 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:19.979536 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:19.979595 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:20.018198 1707070 cri.go:89] found id: ""
	I1124 09:27:20.018220 1707070 logs.go:282] 0 containers: []
	W1124 09:27:20.018229 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:20.018235 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:20.018297 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:20.046055 1707070 cri.go:89] found id: ""
	I1124 09:27:20.046070 1707070 logs.go:282] 0 containers: []
	W1124 09:27:20.046077 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:20.046082 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:20.046158 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:20.078159 1707070 cri.go:89] found id: ""
	I1124 09:27:20.078183 1707070 logs.go:282] 0 containers: []
	W1124 09:27:20.078191 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:20.078197 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:20.078289 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:20.104136 1707070 cri.go:89] found id: ""
	I1124 09:27:20.104151 1707070 logs.go:282] 0 containers: []
	W1124 09:27:20.104158 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:20.104164 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:20.104228 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:20.130266 1707070 cri.go:89] found id: ""
	I1124 09:27:20.130280 1707070 logs.go:282] 0 containers: []
	W1124 09:27:20.130288 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:20.130293 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:20.130352 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:20.156899 1707070 cri.go:89] found id: ""
	I1124 09:27:20.156913 1707070 logs.go:282] 0 containers: []
	W1124 09:27:20.156921 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:20.156926 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:20.156986 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:20.182706 1707070 cri.go:89] found id: ""
	I1124 09:27:20.182721 1707070 logs.go:282] 0 containers: []
	W1124 09:27:20.182728 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:20.182736 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:20.182747 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:20.240720 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:20.240740 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:20.257971 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:20.257987 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:20.324806 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:20.316231   12432 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:20.316929   12432 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:20.317881   12432 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:20.319464   12432 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:20.319918   12432 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1124 09:27:20.324827 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:20.324838 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:20.386188 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:20.386212 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:22.915679 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:22.927190 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:22.927254 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:22.959235 1707070 cri.go:89] found id: ""
	I1124 09:27:22.959249 1707070 logs.go:282] 0 containers: []
	W1124 09:27:22.959256 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:22.959262 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:22.959318 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:22.986124 1707070 cri.go:89] found id: ""
	I1124 09:27:22.986138 1707070 logs.go:282] 0 containers: []
	W1124 09:27:22.986146 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:22.986151 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:22.986206 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:23.016094 1707070 cri.go:89] found id: ""
	I1124 09:27:23.016108 1707070 logs.go:282] 0 containers: []
	W1124 09:27:23.016116 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:23.016121 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:23.016183 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:23.044417 1707070 cri.go:89] found id: ""
	I1124 09:27:23.044431 1707070 logs.go:282] 0 containers: []
	W1124 09:27:23.044439 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:23.044444 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:23.044501 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:23.069468 1707070 cri.go:89] found id: ""
	I1124 09:27:23.069484 1707070 logs.go:282] 0 containers: []
	W1124 09:27:23.069491 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:23.069497 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:23.069556 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:23.096521 1707070 cri.go:89] found id: ""
	I1124 09:27:23.096535 1707070 logs.go:282] 0 containers: []
	W1124 09:27:23.096542 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:23.096548 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:23.096605 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:23.125327 1707070 cri.go:89] found id: ""
	I1124 09:27:23.125342 1707070 logs.go:282] 0 containers: []
	W1124 09:27:23.125349 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:23.125358 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:23.125367 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:23.180584 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:23.180605 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:23.197372 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:23.197388 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:23.259943 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:23.251679   12538 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:23.252410   12538 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:23.253306   12538 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:23.254866   12538 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:23.255334   12538 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:23.251679   12538 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:23.252410   12538 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:23.253306   12538 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:23.254866   12538 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:23.255334   12538 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:23.259953 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:23.259965 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:23.325045 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:23.325066 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:25.855733 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:25.866329 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:25.866395 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:25.906494 1707070 cri.go:89] found id: ""
	I1124 09:27:25.906508 1707070 logs.go:282] 0 containers: []
	W1124 09:27:25.906516 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:25.906521 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:25.906590 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:25.945205 1707070 cri.go:89] found id: ""
	I1124 09:27:25.945229 1707070 logs.go:282] 0 containers: []
	W1124 09:27:25.945237 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:25.945242 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:25.945301 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:25.970721 1707070 cri.go:89] found id: ""
	I1124 09:27:25.970736 1707070 logs.go:282] 0 containers: []
	W1124 09:27:25.970743 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:25.970749 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:25.970807 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:25.997334 1707070 cri.go:89] found id: ""
	I1124 09:27:25.997348 1707070 logs.go:282] 0 containers: []
	W1124 09:27:25.997355 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:25.997364 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:25.997438 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:26.029916 1707070 cri.go:89] found id: ""
	I1124 09:27:26.029932 1707070 logs.go:282] 0 containers: []
	W1124 09:27:26.029940 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:26.029945 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:26.030007 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:26.057466 1707070 cri.go:89] found id: ""
	I1124 09:27:26.057480 1707070 logs.go:282] 0 containers: []
	W1124 09:27:26.057488 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:26.057494 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:26.057565 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:26.083489 1707070 cri.go:89] found id: ""
	I1124 09:27:26.083503 1707070 logs.go:282] 0 containers: []
	W1124 09:27:26.083511 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:26.083519 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:26.083529 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:26.140569 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:26.140588 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:26.158554 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:26.158571 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:26.230573 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:26.222615   12645 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:26.223218   12645 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:26.224819   12645 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:26.225472   12645 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:26.226976   12645 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:26.222615   12645 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:26.223218   12645 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:26.224819   12645 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:26.225472   12645 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:26.226976   12645 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:26.230583 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:26.230594 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:26.292417 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:26.292436 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:28.819944 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:28.830528 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:28.830587 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:28.854228 1707070 cri.go:89] found id: ""
	I1124 09:27:28.854243 1707070 logs.go:282] 0 containers: []
	W1124 09:27:28.854250 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:28.854260 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:28.854324 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:28.891203 1707070 cri.go:89] found id: ""
	I1124 09:27:28.891217 1707070 logs.go:282] 0 containers: []
	W1124 09:27:28.891224 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:28.891230 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:28.891305 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:28.918573 1707070 cri.go:89] found id: ""
	I1124 09:27:28.918587 1707070 logs.go:282] 0 containers: []
	W1124 09:27:28.918594 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:28.918600 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:28.918665 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:28.944672 1707070 cri.go:89] found id: ""
	I1124 09:27:28.944685 1707070 logs.go:282] 0 containers: []
	W1124 09:27:28.944692 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:28.944708 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:28.944763 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:28.970414 1707070 cri.go:89] found id: ""
	I1124 09:27:28.970429 1707070 logs.go:282] 0 containers: []
	W1124 09:27:28.970436 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:28.970441 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:28.970539 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:28.995438 1707070 cri.go:89] found id: ""
	I1124 09:27:28.995453 1707070 logs.go:282] 0 containers: []
	W1124 09:27:28.995460 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:28.995466 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:28.995526 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:29.023817 1707070 cri.go:89] found id: ""
	I1124 09:27:29.023832 1707070 logs.go:282] 0 containers: []
	W1124 09:27:29.023839 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:29.023847 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:29.023858 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:29.080316 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:29.080336 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:29.097486 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:29.097502 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:29.159875 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:29.151793   12751 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:29.152163   12751 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:29.153608   12751 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:29.154019   12751 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:29.155829   12751 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:29.151793   12751 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:29.152163   12751 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:29.153608   12751 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:29.154019   12751 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:29.155829   12751 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:29.159888 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:29.159907 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:29.223729 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:29.223754 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:31.751641 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:31.761798 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:31.761859 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:31.788691 1707070 cri.go:89] found id: ""
	I1124 09:27:31.788705 1707070 logs.go:282] 0 containers: []
	W1124 09:27:31.788711 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:31.788717 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:31.788776 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:31.812359 1707070 cri.go:89] found id: ""
	I1124 09:27:31.812374 1707070 logs.go:282] 0 containers: []
	W1124 09:27:31.812382 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:31.812387 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:31.812450 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:31.837276 1707070 cri.go:89] found id: ""
	I1124 09:27:31.837289 1707070 logs.go:282] 0 containers: []
	W1124 09:27:31.837296 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:31.837302 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:31.837360 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:31.862818 1707070 cri.go:89] found id: ""
	I1124 09:27:31.862832 1707070 logs.go:282] 0 containers: []
	W1124 09:27:31.862840 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:31.862846 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:31.862903 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:31.904922 1707070 cri.go:89] found id: ""
	I1124 09:27:31.904936 1707070 logs.go:282] 0 containers: []
	W1124 09:27:31.904944 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:31.904950 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:31.905012 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:31.949580 1707070 cri.go:89] found id: ""
	I1124 09:27:31.949594 1707070 logs.go:282] 0 containers: []
	W1124 09:27:31.949601 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:31.949607 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:31.949661 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:31.975157 1707070 cri.go:89] found id: ""
	I1124 09:27:31.975171 1707070 logs.go:282] 0 containers: []
	W1124 09:27:31.975178 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:31.975187 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:31.975198 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:32.004216 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:32.004239 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:32.064444 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:32.064466 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:32.084210 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:32.084229 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:32.152949 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:32.144237   12868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:32.145124   12868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:32.147159   12868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:32.147890   12868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:32.148900   12868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:32.144237   12868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:32.145124   12868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:32.147159   12868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:32.147890   12868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:32.148900   12868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:32.152963 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:32.152975 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:34.714493 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:34.725033 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:34.725101 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:34.750339 1707070 cri.go:89] found id: ""
	I1124 09:27:34.750352 1707070 logs.go:282] 0 containers: []
	W1124 09:27:34.750359 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:34.750365 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:34.750422 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:34.776574 1707070 cri.go:89] found id: ""
	I1124 09:27:34.776588 1707070 logs.go:282] 0 containers: []
	W1124 09:27:34.776595 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:34.776600 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:34.776656 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:34.801274 1707070 cri.go:89] found id: ""
	I1124 09:27:34.801288 1707070 logs.go:282] 0 containers: []
	W1124 09:27:34.801295 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:34.801300 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:34.801355 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:34.828204 1707070 cri.go:89] found id: ""
	I1124 09:27:34.828217 1707070 logs.go:282] 0 containers: []
	W1124 09:27:34.828224 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:34.828230 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:34.828286 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:34.856488 1707070 cri.go:89] found id: ""
	I1124 09:27:34.856502 1707070 logs.go:282] 0 containers: []
	W1124 09:27:34.856509 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:34.856514 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:34.856571 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:34.882889 1707070 cri.go:89] found id: ""
	I1124 09:27:34.882903 1707070 logs.go:282] 0 containers: []
	W1124 09:27:34.882914 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:34.882919 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:34.882988 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:34.914562 1707070 cri.go:89] found id: ""
	I1124 09:27:34.914576 1707070 logs.go:282] 0 containers: []
	W1124 09:27:34.914583 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:34.914591 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:34.914601 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:34.981562 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:34.981596 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:34.998925 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:34.998941 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:35.070877 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:35.062206   12962 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:35.063028   12962 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:35.064710   12962 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:35.065308   12962 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:35.067060   12962 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:35.062206   12962 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:35.063028   12962 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:35.064710   12962 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:35.065308   12962 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:35.067060   12962 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:35.070899 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:35.070909 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:35.137172 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:35.137193 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:37.666865 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:37.677121 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:37.677182 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:37.702376 1707070 cri.go:89] found id: ""
	I1124 09:27:37.702390 1707070 logs.go:282] 0 containers: []
	W1124 09:27:37.702398 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:37.702407 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:37.702491 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:37.727342 1707070 cri.go:89] found id: ""
	I1124 09:27:37.727355 1707070 logs.go:282] 0 containers: []
	W1124 09:27:37.727363 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:37.727368 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:37.727430 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:37.753323 1707070 cri.go:89] found id: ""
	I1124 09:27:37.753336 1707070 logs.go:282] 0 containers: []
	W1124 09:27:37.753343 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:37.753349 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:37.753409 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:37.781020 1707070 cri.go:89] found id: ""
	I1124 09:27:37.781041 1707070 logs.go:282] 0 containers: []
	W1124 09:27:37.781049 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:37.781055 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:37.781117 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:37.805925 1707070 cri.go:89] found id: ""
	I1124 09:27:37.805939 1707070 logs.go:282] 0 containers: []
	W1124 09:27:37.805946 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:37.805952 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:37.806013 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:37.833036 1707070 cri.go:89] found id: ""
	I1124 09:27:37.833062 1707070 logs.go:282] 0 containers: []
	W1124 09:27:37.833069 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:37.833075 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:37.833140 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:37.860115 1707070 cri.go:89] found id: ""
	I1124 09:27:37.860129 1707070 logs.go:282] 0 containers: []
	W1124 09:27:37.860137 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:37.860145 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:37.860156 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:37.926098 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:37.926118 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:37.960030 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:37.960045 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:38.019375 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:38.019395 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:38.039066 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:38.039085 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:38.110062 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:38.101570   13079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:38.102692   13079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:38.104495   13079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:38.105053   13079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:38.106366   13079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:38.101570   13079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:38.102692   13079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:38.104495   13079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:38.105053   13079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:38.106366   13079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:40.610482 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:40.620402 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:40.620472 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:40.648289 1707070 cri.go:89] found id: ""
	I1124 09:27:40.648303 1707070 logs.go:282] 0 containers: []
	W1124 09:27:40.648311 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:40.648317 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:40.648373 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:40.672588 1707070 cri.go:89] found id: ""
	I1124 09:27:40.672603 1707070 logs.go:282] 0 containers: []
	W1124 09:27:40.672610 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:40.672616 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:40.672673 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:40.700039 1707070 cri.go:89] found id: ""
	I1124 09:27:40.700053 1707070 logs.go:282] 0 containers: []
	W1124 09:27:40.700060 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:40.700066 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:40.700129 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:40.728494 1707070 cri.go:89] found id: ""
	I1124 09:27:40.728508 1707070 logs.go:282] 0 containers: []
	W1124 09:27:40.728516 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:40.728522 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:40.728582 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:40.753773 1707070 cri.go:89] found id: ""
	I1124 09:27:40.753786 1707070 logs.go:282] 0 containers: []
	W1124 09:27:40.753793 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:40.753798 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:40.753860 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:40.778243 1707070 cri.go:89] found id: ""
	I1124 09:27:40.778257 1707070 logs.go:282] 0 containers: []
	W1124 09:27:40.778264 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:40.778270 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:40.778333 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:40.804316 1707070 cri.go:89] found id: ""
	I1124 09:27:40.804329 1707070 logs.go:282] 0 containers: []
	W1124 09:27:40.804350 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:40.804358 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:40.804370 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:40.821314 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:40.821330 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:40.901213 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:40.878028   13164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:40.878824   13164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:40.894654   13164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:40.895170   13164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:40.896920   13164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:40.878028   13164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:40.878824   13164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:40.894654   13164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:40.895170   13164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:40.896920   13164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:40.901232 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:40.901242 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:40.972785 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:40.972806 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:41.000947 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:41.000967 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:43.560416 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:43.570821 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:43.570882 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:43.595557 1707070 cri.go:89] found id: ""
	I1124 09:27:43.595571 1707070 logs.go:282] 0 containers: []
	W1124 09:27:43.595579 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:43.595585 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:43.595640 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:43.623980 1707070 cri.go:89] found id: ""
	I1124 09:27:43.623996 1707070 logs.go:282] 0 containers: []
	W1124 09:27:43.624003 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:43.624008 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:43.624074 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:43.649674 1707070 cri.go:89] found id: ""
	I1124 09:27:43.649688 1707070 logs.go:282] 0 containers: []
	W1124 09:27:43.649695 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:43.649701 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:43.649758 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:43.673375 1707070 cri.go:89] found id: ""
	I1124 09:27:43.673388 1707070 logs.go:282] 0 containers: []
	W1124 09:27:43.673397 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:43.673403 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:43.673459 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:43.700917 1707070 cri.go:89] found id: ""
	I1124 09:27:43.700931 1707070 logs.go:282] 0 containers: []
	W1124 09:27:43.700938 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:43.700943 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:43.701000 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:43.725453 1707070 cri.go:89] found id: ""
	I1124 09:27:43.725467 1707070 logs.go:282] 0 containers: []
	W1124 09:27:43.725481 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:43.725487 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:43.725557 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:43.755304 1707070 cri.go:89] found id: ""
	I1124 09:27:43.755318 1707070 logs.go:282] 0 containers: []
	W1124 09:27:43.755326 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:43.755335 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:43.755346 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:43.772549 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:43.772567 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:43.837565 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:43.829378   13269 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:43.829969   13269 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:43.831587   13269 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:43.832265   13269 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:43.833938   13269 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:43.829378   13269 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:43.829969   13269 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:43.831587   13269 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:43.832265   13269 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:43.833938   13269 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:43.837575 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:43.837587 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:43.898949 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:43.898969 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:43.934259 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:43.934277 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:46.497111 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:46.507177 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:46.507251 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:46.531012 1707070 cri.go:89] found id: ""
	I1124 09:27:46.531025 1707070 logs.go:282] 0 containers: []
	W1124 09:27:46.531032 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:46.531038 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:46.531101 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:46.555781 1707070 cri.go:89] found id: ""
	I1124 09:27:46.555795 1707070 logs.go:282] 0 containers: []
	W1124 09:27:46.555802 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:46.555807 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:46.555864 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:46.580956 1707070 cri.go:89] found id: ""
	I1124 09:27:46.580974 1707070 logs.go:282] 0 containers: []
	W1124 09:27:46.580982 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:46.580987 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:46.581055 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:46.606320 1707070 cri.go:89] found id: ""
	I1124 09:27:46.606333 1707070 logs.go:282] 0 containers: []
	W1124 09:27:46.606340 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:46.606346 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:46.606414 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:46.632671 1707070 cri.go:89] found id: ""
	I1124 09:27:46.632685 1707070 logs.go:282] 0 containers: []
	W1124 09:27:46.632692 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:46.632697 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:46.632755 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:46.656948 1707070 cri.go:89] found id: ""
	I1124 09:27:46.656962 1707070 logs.go:282] 0 containers: []
	W1124 09:27:46.656969 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:46.656975 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:46.657037 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:46.681897 1707070 cri.go:89] found id: ""
	I1124 09:27:46.681910 1707070 logs.go:282] 0 containers: []
	W1124 09:27:46.681917 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:46.681925 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:46.681936 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:46.698822 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:46.698839 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:46.763473 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:46.755294   13373 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:46.755864   13373 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:46.757448   13373 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:46.758065   13373 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:46.759847   13373 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:46.755294   13373 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:46.755864   13373 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:46.757448   13373 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:46.758065   13373 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:46.759847   13373 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:46.763499 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:46.763510 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:46.826271 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:46.826293 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:46.855001 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:46.855017 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:49.412865 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:49.423511 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:49.423574 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:49.447618 1707070 cri.go:89] found id: ""
	I1124 09:27:49.447632 1707070 logs.go:282] 0 containers: []
	W1124 09:27:49.447639 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:49.447645 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:49.447705 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:49.476127 1707070 cri.go:89] found id: ""
	I1124 09:27:49.476140 1707070 logs.go:282] 0 containers: []
	W1124 09:27:49.476147 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:49.476154 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:49.476213 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:49.501684 1707070 cri.go:89] found id: ""
	I1124 09:27:49.501697 1707070 logs.go:282] 0 containers: []
	W1124 09:27:49.501705 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:49.501711 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:49.501771 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:49.527011 1707070 cri.go:89] found id: ""
	I1124 09:27:49.527025 1707070 logs.go:282] 0 containers: []
	W1124 09:27:49.527033 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:49.527038 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:49.527098 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:49.552026 1707070 cri.go:89] found id: ""
	I1124 09:27:49.552040 1707070 logs.go:282] 0 containers: []
	W1124 09:27:49.552047 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:49.552053 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:49.552110 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:49.582162 1707070 cri.go:89] found id: ""
	I1124 09:27:49.582189 1707070 logs.go:282] 0 containers: []
	W1124 09:27:49.582196 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:49.582202 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:49.582275 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:49.612653 1707070 cri.go:89] found id: ""
	I1124 09:27:49.612667 1707070 logs.go:282] 0 containers: []
	W1124 09:27:49.612675 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:49.612683 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:49.612693 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:49.668483 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:49.668504 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:49.685463 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:49.685480 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:49.750076 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:49.741868   13477 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:49.742309   13477 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:49.744083   13477 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:49.744608   13477 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:49.746375   13477 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:49.741868   13477 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:49.742309   13477 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:49.744083   13477 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:49.744608   13477 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:49.746375   13477 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:49.750136 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:49.750148 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:49.811614 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:49.811634 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:52.341239 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:52.351722 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:52.351784 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:52.378388 1707070 cri.go:89] found id: ""
	I1124 09:27:52.378402 1707070 logs.go:282] 0 containers: []
	W1124 09:27:52.378410 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:52.378416 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:52.378498 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:52.404052 1707070 cri.go:89] found id: ""
	I1124 09:27:52.404067 1707070 logs.go:282] 0 containers: []
	W1124 09:27:52.404074 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:52.404079 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:52.404138 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:52.428854 1707070 cri.go:89] found id: ""
	I1124 09:27:52.428868 1707070 logs.go:282] 0 containers: []
	W1124 09:27:52.428876 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:52.428882 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:52.428945 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:52.460795 1707070 cri.go:89] found id: ""
	I1124 09:27:52.460808 1707070 logs.go:282] 0 containers: []
	W1124 09:27:52.460815 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:52.460825 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:52.460886 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:52.490351 1707070 cri.go:89] found id: ""
	I1124 09:27:52.490365 1707070 logs.go:282] 0 containers: []
	W1124 09:27:52.490372 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:52.490378 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:52.490438 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:52.515789 1707070 cri.go:89] found id: ""
	I1124 09:27:52.515804 1707070 logs.go:282] 0 containers: []
	W1124 09:27:52.515811 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:52.515816 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:52.515874 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:52.544304 1707070 cri.go:89] found id: ""
	I1124 09:27:52.544318 1707070 logs.go:282] 0 containers: []
	W1124 09:27:52.544326 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:52.544335 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:52.544347 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:52.611718 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:52.603411   13577 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:52.604016   13577 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:52.605628   13577 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:52.606175   13577 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:52.607864   13577 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:52.603411   13577 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:52.604016   13577 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:52.605628   13577 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:52.606175   13577 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:52.607864   13577 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:52.611731 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:52.611743 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:52.679720 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:52.679740 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:52.708422 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:52.708437 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:52.766414 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:52.766433 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:55.285861 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:55.296023 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:55.296086 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:55.324396 1707070 cri.go:89] found id: ""
	I1124 09:27:55.324409 1707070 logs.go:282] 0 containers: []
	W1124 09:27:55.324417 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:55.324422 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:55.324478 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:55.348746 1707070 cri.go:89] found id: ""
	I1124 09:27:55.348760 1707070 logs.go:282] 0 containers: []
	W1124 09:27:55.348767 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:55.348773 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:55.348832 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:55.373685 1707070 cri.go:89] found id: ""
	I1124 09:27:55.373710 1707070 logs.go:282] 0 containers: []
	W1124 09:27:55.373718 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:55.373724 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:55.373780 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:55.399757 1707070 cri.go:89] found id: ""
	I1124 09:27:55.399774 1707070 logs.go:282] 0 containers: []
	W1124 09:27:55.399783 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:55.399789 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:55.399848 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:55.424773 1707070 cri.go:89] found id: ""
	I1124 09:27:55.424788 1707070 logs.go:282] 0 containers: []
	W1124 09:27:55.424795 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:55.424800 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:55.424862 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:55.450083 1707070 cri.go:89] found id: ""
	I1124 09:27:55.450097 1707070 logs.go:282] 0 containers: []
	W1124 09:27:55.450104 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:55.450112 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:55.450170 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:55.474225 1707070 cri.go:89] found id: ""
	I1124 09:27:55.474239 1707070 logs.go:282] 0 containers: []
	W1124 09:27:55.474247 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:55.474254 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:55.474264 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:27:55.507455 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:55.507477 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:55.563391 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:55.563414 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:55.583115 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:55.583131 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:55.648979 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:55.641409   13700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:55.642033   13700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:55.643543   13700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:55.644021   13700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:55.645529   13700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:55.641409   13700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:55.642033   13700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:55.643543   13700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:55.644021   13700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:55.645529   13700 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:55.648991 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:55.649004 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:58.210584 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:27:58.221285 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:27:58.221351 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:27:58.250526 1707070 cri.go:89] found id: ""
	I1124 09:27:58.250541 1707070 logs.go:282] 0 containers: []
	W1124 09:27:58.250548 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:27:58.250554 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:27:58.250612 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:27:58.275099 1707070 cri.go:89] found id: ""
	I1124 09:27:58.275116 1707070 logs.go:282] 0 containers: []
	W1124 09:27:58.275123 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:27:58.275129 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:27:58.275189 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:27:58.300058 1707070 cri.go:89] found id: ""
	I1124 09:27:58.300075 1707070 logs.go:282] 0 containers: []
	W1124 09:27:58.300082 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:27:58.300087 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:27:58.300148 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:27:58.323564 1707070 cri.go:89] found id: ""
	I1124 09:27:58.323578 1707070 logs.go:282] 0 containers: []
	W1124 09:27:58.323585 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:27:58.323591 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:27:58.323648 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:27:58.348441 1707070 cri.go:89] found id: ""
	I1124 09:27:58.348455 1707070 logs.go:282] 0 containers: []
	W1124 09:27:58.348463 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:27:58.348468 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:27:58.348527 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:27:58.374283 1707070 cri.go:89] found id: ""
	I1124 09:27:58.374297 1707070 logs.go:282] 0 containers: []
	W1124 09:27:58.374305 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:27:58.374310 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:27:58.374371 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:27:58.400624 1707070 cri.go:89] found id: ""
	I1124 09:27:58.400638 1707070 logs.go:282] 0 containers: []
	W1124 09:27:58.400645 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:27:58.400653 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:27:58.400664 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:27:58.457055 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:27:58.457075 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:27:58.474204 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:27:58.474236 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:27:58.538738 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:27:58.530985   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:58.531628   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:58.533238   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:58.533555   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:58.535049   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:27:58.530985   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:58.531628   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:58.533238   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:58.533555   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:27:58.535049   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:27:58.538748 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:27:58.538761 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:27:58.601043 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:27:58.601064 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:01.129158 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:01.152628 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:01.152709 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:01.199688 1707070 cri.go:89] found id: ""
	I1124 09:28:01.199703 1707070 logs.go:282] 0 containers: []
	W1124 09:28:01.199710 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:01.199716 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:01.199778 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:01.226293 1707070 cri.go:89] found id: ""
	I1124 09:28:01.226307 1707070 logs.go:282] 0 containers: []
	W1124 09:28:01.226314 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:01.226319 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:01.226379 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:01.252021 1707070 cri.go:89] found id: ""
	I1124 09:28:01.252036 1707070 logs.go:282] 0 containers: []
	W1124 09:28:01.252043 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:01.252049 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:01.252108 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:01.278563 1707070 cri.go:89] found id: ""
	I1124 09:28:01.278577 1707070 logs.go:282] 0 containers: []
	W1124 09:28:01.278585 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:01.278591 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:01.278697 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:01.304781 1707070 cri.go:89] found id: ""
	I1124 09:28:01.304808 1707070 logs.go:282] 0 containers: []
	W1124 09:28:01.304816 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:01.304822 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:01.304900 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:01.330549 1707070 cri.go:89] found id: ""
	I1124 09:28:01.330574 1707070 logs.go:282] 0 containers: []
	W1124 09:28:01.330581 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:01.330586 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:01.330657 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:01.355624 1707070 cri.go:89] found id: ""
	I1124 09:28:01.355646 1707070 logs.go:282] 0 containers: []
	W1124 09:28:01.355654 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:01.355661 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:01.355673 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:01.411485 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:01.411504 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:01.428912 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:01.428927 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:01.493859 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:01.485758   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:01.486490   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:01.488127   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:01.488656   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:01.490257   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:01.485758   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:01.486490   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:01.488127   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:01.488656   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:01.490257   13893 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:01.493881 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:01.493892 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:01.554787 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:01.554808 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:04.088481 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:04.099124 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:04.099191 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:04.123836 1707070 cri.go:89] found id: ""
	I1124 09:28:04.123849 1707070 logs.go:282] 0 containers: []
	W1124 09:28:04.123857 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:04.123862 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:04.123927 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:04.159485 1707070 cri.go:89] found id: ""
	I1124 09:28:04.159499 1707070 logs.go:282] 0 containers: []
	W1124 09:28:04.159506 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:04.159511 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:04.159572 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:04.187075 1707070 cri.go:89] found id: ""
	I1124 09:28:04.187089 1707070 logs.go:282] 0 containers: []
	W1124 09:28:04.187106 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:04.187112 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:04.187169 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:04.217664 1707070 cri.go:89] found id: ""
	I1124 09:28:04.217677 1707070 logs.go:282] 0 containers: []
	W1124 09:28:04.217696 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:04.217702 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:04.217769 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:04.244060 1707070 cri.go:89] found id: ""
	I1124 09:28:04.244075 1707070 logs.go:282] 0 containers: []
	W1124 09:28:04.244082 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:04.244087 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:04.244151 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:04.269297 1707070 cri.go:89] found id: ""
	I1124 09:28:04.269311 1707070 logs.go:282] 0 containers: []
	W1124 09:28:04.269318 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:04.269323 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:04.269382 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:04.296714 1707070 cri.go:89] found id: ""
	I1124 09:28:04.296730 1707070 logs.go:282] 0 containers: []
	W1124 09:28:04.296737 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:04.296745 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:04.296760 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:04.352538 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:04.352558 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:04.370334 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:04.370357 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:04.439006 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:04.429890   13996 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:04.430808   13996 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:04.432656   13996 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:04.433242   13996 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:04.435153   13996 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:04.429890   13996 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:04.430808   13996 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:04.432656   13996 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:04.433242   13996 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:04.435153   13996 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:04.439018 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:04.439027 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:04.503050 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:04.503072 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:07.038611 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:07.049789 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:07.049861 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:07.074863 1707070 cri.go:89] found id: ""
	I1124 09:28:07.074878 1707070 logs.go:282] 0 containers: []
	W1124 09:28:07.074885 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:07.074893 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:07.074950 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:07.099042 1707070 cri.go:89] found id: ""
	I1124 09:28:07.099057 1707070 logs.go:282] 0 containers: []
	W1124 09:28:07.099064 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:07.099070 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:07.099131 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:07.123608 1707070 cri.go:89] found id: ""
	I1124 09:28:07.123622 1707070 logs.go:282] 0 containers: []
	W1124 09:28:07.123630 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:07.123635 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:07.123706 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:07.151391 1707070 cri.go:89] found id: ""
	I1124 09:28:07.151405 1707070 logs.go:282] 0 containers: []
	W1124 09:28:07.151412 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:07.151418 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:07.151475 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:07.182488 1707070 cri.go:89] found id: ""
	I1124 09:28:07.182502 1707070 logs.go:282] 0 containers: []
	W1124 09:28:07.182510 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:07.182515 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:07.182581 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:07.207523 1707070 cri.go:89] found id: ""
	I1124 09:28:07.207537 1707070 logs.go:282] 0 containers: []
	W1124 09:28:07.207546 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:07.207552 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:07.207614 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:07.233412 1707070 cri.go:89] found id: ""
	I1124 09:28:07.233426 1707070 logs.go:282] 0 containers: []
	W1124 09:28:07.233433 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:07.233441 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:07.233451 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:07.288900 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:07.288922 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:07.306472 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:07.306493 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:07.368097 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:07.360574   14102 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:07.360956   14102 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:07.362483   14102 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:07.362820   14102 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:07.364269   14102 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:07.360574   14102 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:07.360956   14102 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:07.362483   14102 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:07.362820   14102 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:07.364269   14102 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:07.368108 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:07.368121 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:07.429983 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:07.430002 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:09.965289 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:09.976378 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:09.976448 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:10.015687 1707070 cri.go:89] found id: ""
	I1124 09:28:10.015705 1707070 logs.go:282] 0 containers: []
	W1124 09:28:10.015714 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:10.015721 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:10.015811 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:10.042717 1707070 cri.go:89] found id: ""
	I1124 09:28:10.042731 1707070 logs.go:282] 0 containers: []
	W1124 09:28:10.042738 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:10.042743 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:10.042805 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:10.069226 1707070 cri.go:89] found id: ""
	I1124 09:28:10.069240 1707070 logs.go:282] 0 containers: []
	W1124 09:28:10.069259 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:10.069265 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:10.069336 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:10.094576 1707070 cri.go:89] found id: ""
	I1124 09:28:10.094591 1707070 logs.go:282] 0 containers: []
	W1124 09:28:10.094599 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:10.094604 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:10.094683 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:10.120910 1707070 cri.go:89] found id: ""
	I1124 09:28:10.120925 1707070 logs.go:282] 0 containers: []
	W1124 09:28:10.120932 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:10.120938 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:10.121007 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:10.148454 1707070 cri.go:89] found id: ""
	I1124 09:28:10.148467 1707070 logs.go:282] 0 containers: []
	W1124 09:28:10.148476 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:10.148482 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:10.148545 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:10.180342 1707070 cri.go:89] found id: ""
	I1124 09:28:10.180356 1707070 logs.go:282] 0 containers: []
	W1124 09:28:10.180363 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:10.180377 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:10.180387 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:10.237982 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:10.238001 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:10.254875 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:10.254891 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:10.315902 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:10.307876   14207 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:10.308640   14207 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:10.310183   14207 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:10.310727   14207 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:10.312228   14207 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:10.307876   14207 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:10.308640   14207 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:10.310183   14207 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:10.310727   14207 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:10.312228   14207 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:10.315912 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:10.315922 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:10.381257 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:10.381276 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:12.913595 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:12.923674 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:12.923734 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:12.947804 1707070 cri.go:89] found id: ""
	I1124 09:28:12.947818 1707070 logs.go:282] 0 containers: []
	W1124 09:28:12.947826 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:12.947832 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:12.947892 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:12.971923 1707070 cri.go:89] found id: ""
	I1124 09:28:12.971937 1707070 logs.go:282] 0 containers: []
	W1124 09:28:12.971944 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:12.971956 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:12.972017 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:12.996325 1707070 cri.go:89] found id: ""
	I1124 09:28:12.996339 1707070 logs.go:282] 0 containers: []
	W1124 09:28:12.996357 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:12.996364 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:12.996436 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:13.022187 1707070 cri.go:89] found id: ""
	I1124 09:28:13.022203 1707070 logs.go:282] 0 containers: []
	W1124 09:28:13.022211 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:13.022224 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:13.022296 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:13.048161 1707070 cri.go:89] found id: ""
	I1124 09:28:13.048184 1707070 logs.go:282] 0 containers: []
	W1124 09:28:13.048192 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:13.048198 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:13.048262 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:13.073539 1707070 cri.go:89] found id: ""
	I1124 09:28:13.073564 1707070 logs.go:282] 0 containers: []
	W1124 09:28:13.073571 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:13.073578 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:13.073655 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:13.098089 1707070 cri.go:89] found id: ""
	I1124 09:28:13.098106 1707070 logs.go:282] 0 containers: []
	W1124 09:28:13.098114 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:13.098122 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:13.098132 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:13.140239 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:13.140255 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:13.197847 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:13.197865 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:13.217667 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:13.217686 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:13.281312 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:13.272865   14321 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:13.273748   14321 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:13.275370   14321 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:13.275717   14321 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:13.277237   14321 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:13.272865   14321 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:13.273748   14321 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:13.275370   14321 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:13.275717   14321 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:13.277237   14321 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:13.281322 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:13.281334 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:15.842684 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:15.853250 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:15.853311 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:15.878981 1707070 cri.go:89] found id: ""
	I1124 09:28:15.878995 1707070 logs.go:282] 0 containers: []
	W1124 09:28:15.879030 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:15.879036 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:15.879099 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:15.904674 1707070 cri.go:89] found id: ""
	I1124 09:28:15.904687 1707070 logs.go:282] 0 containers: []
	W1124 09:28:15.904695 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:15.904700 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:15.904757 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:15.929766 1707070 cri.go:89] found id: ""
	I1124 09:28:15.929780 1707070 logs.go:282] 0 containers: []
	W1124 09:28:15.929787 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:15.929793 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:15.929851 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:15.955453 1707070 cri.go:89] found id: ""
	I1124 09:28:15.955468 1707070 logs.go:282] 0 containers: []
	W1124 09:28:15.955475 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:15.955485 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:15.955543 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:15.983839 1707070 cri.go:89] found id: ""
	I1124 09:28:15.983854 1707070 logs.go:282] 0 containers: []
	W1124 09:28:15.983861 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:15.983866 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:15.983924 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:16.014730 1707070 cri.go:89] found id: ""
	I1124 09:28:16.014744 1707070 logs.go:282] 0 containers: []
	W1124 09:28:16.014752 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:16.014757 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:16.014820 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:16.046753 1707070 cri.go:89] found id: ""
	I1124 09:28:16.046767 1707070 logs.go:282] 0 containers: []
	W1124 09:28:16.046775 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:16.046783 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:16.046794 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:16.064199 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:16.064217 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:16.139691 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:16.122247   14407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:16.122923   14407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:16.124768   14407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:16.125231   14407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:16.126838   14407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:16.122247   14407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:16.122923   14407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:16.124768   14407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:16.125231   14407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:16.126838   14407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:16.139701 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:16.139711 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:16.206802 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:16.206822 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:16.234674 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:16.234690 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:18.790282 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:18.801848 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:18.801912 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:18.827821 1707070 cri.go:89] found id: ""
	I1124 09:28:18.827836 1707070 logs.go:282] 0 containers: []
	W1124 09:28:18.827843 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:18.827849 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:18.827905 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:18.852169 1707070 cri.go:89] found id: ""
	I1124 09:28:18.852184 1707070 logs.go:282] 0 containers: []
	W1124 09:28:18.852191 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:18.852196 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:18.852253 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:18.878610 1707070 cri.go:89] found id: ""
	I1124 09:28:18.878625 1707070 logs.go:282] 0 containers: []
	W1124 09:28:18.878633 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:18.878638 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:18.878702 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:18.903384 1707070 cri.go:89] found id: ""
	I1124 09:28:18.903403 1707070 logs.go:282] 0 containers: []
	W1124 09:28:18.903410 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:18.903416 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:18.903476 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:18.928519 1707070 cri.go:89] found id: ""
	I1124 09:28:18.928534 1707070 logs.go:282] 0 containers: []
	W1124 09:28:18.928542 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:18.928547 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:18.928609 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:18.956808 1707070 cri.go:89] found id: ""
	I1124 09:28:18.956823 1707070 logs.go:282] 0 containers: []
	W1124 09:28:18.956830 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:18.956836 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:18.956893 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:18.985113 1707070 cri.go:89] found id: ""
	I1124 09:28:18.985127 1707070 logs.go:282] 0 containers: []
	W1124 09:28:18.985134 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:18.985142 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:18.985152 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:19.019130 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:19.019146 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:19.075193 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:19.075213 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:19.092291 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:19.092306 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:19.162819 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:19.154959   14526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:19.155361   14526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:19.156834   14526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:19.157156   14526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:19.158629   14526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:19.154959   14526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:19.155361   14526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:19.156834   14526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:19.157156   14526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:19.158629   14526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:19.162839 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:19.162850 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:21.737895 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:21.748053 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:21.748120 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:21.773590 1707070 cri.go:89] found id: ""
	I1124 09:28:21.773604 1707070 logs.go:282] 0 containers: []
	W1124 09:28:21.773611 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:21.773618 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:21.773679 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:21.800809 1707070 cri.go:89] found id: ""
	I1124 09:28:21.800866 1707070 logs.go:282] 0 containers: []
	W1124 09:28:21.800874 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:21.800880 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:21.800938 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:21.826581 1707070 cri.go:89] found id: ""
	I1124 09:28:21.826594 1707070 logs.go:282] 0 containers: []
	W1124 09:28:21.826602 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:21.826607 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:21.826668 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:21.856267 1707070 cri.go:89] found id: ""
	I1124 09:28:21.856282 1707070 logs.go:282] 0 containers: []
	W1124 09:28:21.856289 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:21.856295 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:21.856354 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:21.885138 1707070 cri.go:89] found id: ""
	I1124 09:28:21.885152 1707070 logs.go:282] 0 containers: []
	W1124 09:28:21.885160 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:21.885165 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:21.885224 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:21.909643 1707070 cri.go:89] found id: ""
	I1124 09:28:21.909657 1707070 logs.go:282] 0 containers: []
	W1124 09:28:21.909665 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:21.909671 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:21.909727 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:21.936792 1707070 cri.go:89] found id: ""
	I1124 09:28:21.936806 1707070 logs.go:282] 0 containers: []
	W1124 09:28:21.936813 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:21.936821 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:21.936831 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:21.993870 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:21.993890 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:22.011453 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:22.011474 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:22.078376 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:22.069998   14620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:22.070791   14620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:22.072423   14620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:22.073020   14620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:22.074616   14620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:22.069998   14620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:22.070791   14620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:22.072423   14620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:22.073020   14620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:22.074616   14620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:22.078387 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:22.078398 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:22.140934 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:22.140953 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:24.669313 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:24.679257 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:24.679328 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:24.707632 1707070 cri.go:89] found id: ""
	I1124 09:28:24.707647 1707070 logs.go:282] 0 containers: []
	W1124 09:28:24.707654 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:24.707660 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:24.707720 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:24.733688 1707070 cri.go:89] found id: ""
	I1124 09:28:24.733702 1707070 logs.go:282] 0 containers: []
	W1124 09:28:24.733710 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:24.733715 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:24.733773 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:24.759056 1707070 cri.go:89] found id: ""
	I1124 09:28:24.759071 1707070 logs.go:282] 0 containers: []
	W1124 09:28:24.759078 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:24.759084 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:24.759143 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:24.789918 1707070 cri.go:89] found id: ""
	I1124 09:28:24.789931 1707070 logs.go:282] 0 containers: []
	W1124 09:28:24.789938 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:24.789944 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:24.790003 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:24.814684 1707070 cri.go:89] found id: ""
	I1124 09:28:24.814698 1707070 logs.go:282] 0 containers: []
	W1124 09:28:24.814709 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:24.814714 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:24.814773 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:24.839467 1707070 cri.go:89] found id: ""
	I1124 09:28:24.839489 1707070 logs.go:282] 0 containers: []
	W1124 09:28:24.839497 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:24.839503 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:24.839568 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:24.863902 1707070 cri.go:89] found id: ""
	I1124 09:28:24.863917 1707070 logs.go:282] 0 containers: []
	W1124 09:28:24.863925 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:24.863933 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:24.863943 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:24.919300 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:24.919320 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:24.936150 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:24.936167 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:24.998414 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:24.990181   14723 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:24.990882   14723 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:24.992541   14723 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:24.993206   14723 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:24.994900   14723 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:24.990181   14723 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:24.990882   14723 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:24.992541   14723 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:24.993206   14723 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:24.994900   14723 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:24.998425 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:24.998435 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:25.062735 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:25.062756 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:27.591381 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:27.601598 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:27.601658 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:27.626062 1707070 cri.go:89] found id: ""
	I1124 09:28:27.626076 1707070 logs.go:282] 0 containers: []
	W1124 09:28:27.626084 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:27.626090 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:27.626152 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:27.654571 1707070 cri.go:89] found id: ""
	I1124 09:28:27.654591 1707070 logs.go:282] 0 containers: []
	W1124 09:28:27.654599 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:27.654604 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:27.654664 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:27.679294 1707070 cri.go:89] found id: ""
	I1124 09:28:27.679308 1707070 logs.go:282] 0 containers: []
	W1124 09:28:27.679315 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:27.679320 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:27.679377 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:27.702575 1707070 cri.go:89] found id: ""
	I1124 09:28:27.702588 1707070 logs.go:282] 0 containers: []
	W1124 09:28:27.702595 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:27.702601 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:27.702657 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:27.728251 1707070 cri.go:89] found id: ""
	I1124 09:28:27.728266 1707070 logs.go:282] 0 containers: []
	W1124 09:28:27.728273 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:27.728279 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:27.728339 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:27.752789 1707070 cri.go:89] found id: ""
	I1124 09:28:27.752802 1707070 logs.go:282] 0 containers: []
	W1124 09:28:27.752809 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:27.752815 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:27.752874 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:27.776833 1707070 cri.go:89] found id: ""
	I1124 09:28:27.776847 1707070 logs.go:282] 0 containers: []
	W1124 09:28:27.776854 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:27.776862 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:27.776871 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:27.837612 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:27.837637 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:27.866873 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:27.866890 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:27.925473 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:27.925492 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:27.942415 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:27.942432 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:28.014797 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:27.999267   14840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:28.000058   14840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:28.002028   14840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:28.002995   14840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:28.005197   14840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:27.999267   14840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:28.000058   14840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:28.002028   14840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:28.002995   14840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:28.005197   14840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:30.515707 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:30.526026 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:30.526102 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:30.550904 1707070 cri.go:89] found id: ""
	I1124 09:28:30.550918 1707070 logs.go:282] 0 containers: []
	W1124 09:28:30.550925 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:30.550931 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:30.550996 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:30.580837 1707070 cri.go:89] found id: ""
	I1124 09:28:30.580851 1707070 logs.go:282] 0 containers: []
	W1124 09:28:30.580859 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:30.580864 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:30.580920 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:30.605291 1707070 cri.go:89] found id: ""
	I1124 09:28:30.605305 1707070 logs.go:282] 0 containers: []
	W1124 09:28:30.605312 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:30.605318 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:30.605376 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:30.630158 1707070 cri.go:89] found id: ""
	I1124 09:28:30.630172 1707070 logs.go:282] 0 containers: []
	W1124 09:28:30.630181 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:30.630187 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:30.630254 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:30.653754 1707070 cri.go:89] found id: ""
	I1124 09:28:30.653772 1707070 logs.go:282] 0 containers: []
	W1124 09:28:30.653785 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:30.653790 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:30.653868 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:30.679137 1707070 cri.go:89] found id: ""
	I1124 09:28:30.679150 1707070 logs.go:282] 0 containers: []
	W1124 09:28:30.679157 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:30.679163 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:30.679221 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:30.703850 1707070 cri.go:89] found id: ""
	I1124 09:28:30.703864 1707070 logs.go:282] 0 containers: []
	W1124 09:28:30.703871 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:30.703879 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:30.703888 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:30.772547 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:30.764218   14925 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:30.764926   14925 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:30.766593   14925 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:30.767134   14925 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:30.768991   14925 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:30.764218   14925 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:30.764926   14925 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:30.766593   14925 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:30.767134   14925 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:30.768991   14925 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:30.772557 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:30.772568 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:30.834024 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:30.834043 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:30.862031 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:30.862046 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:30.920292 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:30.920311 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:33.438606 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:33.448762 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:33.448822 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:33.481032 1707070 cri.go:89] found id: ""
	I1124 09:28:33.481046 1707070 logs.go:282] 0 containers: []
	W1124 09:28:33.481053 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:33.481060 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:33.481117 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:33.504561 1707070 cri.go:89] found id: ""
	I1124 09:28:33.504576 1707070 logs.go:282] 0 containers: []
	W1124 09:28:33.504583 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:33.504589 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:33.504654 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:33.528885 1707070 cri.go:89] found id: ""
	I1124 09:28:33.528899 1707070 logs.go:282] 0 containers: []
	W1124 09:28:33.528906 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:33.528915 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:33.528972 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:33.553244 1707070 cri.go:89] found id: ""
	I1124 09:28:33.553258 1707070 logs.go:282] 0 containers: []
	W1124 09:28:33.553271 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:33.553277 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:33.553334 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:33.578519 1707070 cri.go:89] found id: ""
	I1124 09:28:33.578533 1707070 logs.go:282] 0 containers: []
	W1124 09:28:33.578541 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:33.578546 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:33.578607 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:33.602708 1707070 cri.go:89] found id: ""
	I1124 09:28:33.602721 1707070 logs.go:282] 0 containers: []
	W1124 09:28:33.602729 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:33.602734 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:33.602791 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:33.626894 1707070 cri.go:89] found id: ""
	I1124 09:28:33.626908 1707070 logs.go:282] 0 containers: []
	W1124 09:28:33.626916 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:33.626923 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:33.626934 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:33.684867 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:33.684887 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:33.701817 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:33.701834 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:33.775161 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:33.766757   15037 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:33.767480   15037 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:33.769022   15037 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:33.769484   15037 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:33.770951   15037 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:33.766757   15037 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:33.767480   15037 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:33.769022   15037 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:33.769484   15037 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:33.770951   15037 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:33.775172 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:33.775185 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:33.837667 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:33.837688 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:36.365266 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:36.376558 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:36.376622 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:36.412692 1707070 cri.go:89] found id: ""
	I1124 09:28:36.412706 1707070 logs.go:282] 0 containers: []
	W1124 09:28:36.412714 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:36.412719 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:36.412777 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:36.448943 1707070 cri.go:89] found id: ""
	I1124 09:28:36.448957 1707070 logs.go:282] 0 containers: []
	W1124 09:28:36.448964 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:36.448970 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:36.449031 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:36.474906 1707070 cri.go:89] found id: ""
	I1124 09:28:36.474920 1707070 logs.go:282] 0 containers: []
	W1124 09:28:36.474928 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:36.474934 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:36.474990 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:36.503770 1707070 cri.go:89] found id: ""
	I1124 09:28:36.503784 1707070 logs.go:282] 0 containers: []
	W1124 09:28:36.503792 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:36.503797 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:36.503863 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:36.532858 1707070 cri.go:89] found id: ""
	I1124 09:28:36.532872 1707070 logs.go:282] 0 containers: []
	W1124 09:28:36.532880 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:36.532885 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:36.532944 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:36.557874 1707070 cri.go:89] found id: ""
	I1124 09:28:36.557889 1707070 logs.go:282] 0 containers: []
	W1124 09:28:36.557896 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:36.557902 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:36.557959 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:36.582175 1707070 cri.go:89] found id: ""
	I1124 09:28:36.582189 1707070 logs.go:282] 0 containers: []
	W1124 09:28:36.582204 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:36.582212 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:36.582230 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:36.645586 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:36.637487   15138 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:36.638140   15138 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:36.639873   15138 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:36.640429   15138 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:36.641968   15138 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:36.637487   15138 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:36.638140   15138 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:36.639873   15138 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:36.640429   15138 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:36.641968   15138 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:36.645596 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:36.645607 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:36.708211 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:36.708231 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:36.740877 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:36.740894 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:36.798376 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:36.798396 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:39.316746 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:39.327050 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:39.327111 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:39.351416 1707070 cri.go:89] found id: ""
	I1124 09:28:39.351430 1707070 logs.go:282] 0 containers: []
	W1124 09:28:39.351438 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:39.351444 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:39.351500 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:39.375341 1707070 cri.go:89] found id: ""
	I1124 09:28:39.375355 1707070 logs.go:282] 0 containers: []
	W1124 09:28:39.375362 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:39.375367 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:39.375425 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:39.402220 1707070 cri.go:89] found id: ""
	I1124 09:28:39.402235 1707070 logs.go:282] 0 containers: []
	W1124 09:28:39.402241 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:39.402247 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:39.402306 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:39.434081 1707070 cri.go:89] found id: ""
	I1124 09:28:39.434094 1707070 logs.go:282] 0 containers: []
	W1124 09:28:39.434101 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:39.434107 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:39.434167 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:39.467514 1707070 cri.go:89] found id: ""
	I1124 09:28:39.467528 1707070 logs.go:282] 0 containers: []
	W1124 09:28:39.467535 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:39.467540 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:39.467597 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:39.500947 1707070 cri.go:89] found id: ""
	I1124 09:28:39.500961 1707070 logs.go:282] 0 containers: []
	W1124 09:28:39.500968 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:39.500974 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:39.501034 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:39.526637 1707070 cri.go:89] found id: ""
	I1124 09:28:39.526651 1707070 logs.go:282] 0 containers: []
	W1124 09:28:39.526658 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:39.526666 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:39.526676 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:39.582247 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:39.582268 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:39.599751 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:39.599767 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:39.668271 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:39.660949   15251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:39.661446   15251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:39.663149   15251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:39.663643   15251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:39.664706   15251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:39.660949   15251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:39.661446   15251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:39.663149   15251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:39.663643   15251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:39.664706   15251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:39.668281 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:39.668294 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:39.730931 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:39.730951 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:42.260305 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:42.272405 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:42.272489 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:42.300817 1707070 cri.go:89] found id: ""
	I1124 09:28:42.300842 1707070 logs.go:282] 0 containers: []
	W1124 09:28:42.300850 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:42.300856 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:42.300921 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:42.327350 1707070 cri.go:89] found id: ""
	I1124 09:28:42.327368 1707070 logs.go:282] 0 containers: []
	W1124 09:28:42.327377 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:42.327382 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:42.327441 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:42.352768 1707070 cri.go:89] found id: ""
	I1124 09:28:42.352781 1707070 logs.go:282] 0 containers: []
	W1124 09:28:42.352788 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:42.352794 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:42.352858 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:42.384996 1707070 cri.go:89] found id: ""
	I1124 09:28:42.385016 1707070 logs.go:282] 0 containers: []
	W1124 09:28:42.385024 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:42.385035 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:42.385109 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:42.433916 1707070 cri.go:89] found id: ""
	I1124 09:28:42.433942 1707070 logs.go:282] 0 containers: []
	W1124 09:28:42.433963 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:42.433974 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:42.434041 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:42.469962 1707070 cri.go:89] found id: ""
	I1124 09:28:42.469976 1707070 logs.go:282] 0 containers: []
	W1124 09:28:42.469983 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:42.469989 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:42.470045 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:42.494905 1707070 cri.go:89] found id: ""
	I1124 09:28:42.494919 1707070 logs.go:282] 0 containers: []
	W1124 09:28:42.494926 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:42.494934 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:42.494944 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:42.551276 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:42.551295 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:42.568521 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:42.568538 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:42.631652 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:42.623578   15356 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:42.624203   15356 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:42.625718   15356 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:42.626134   15356 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:42.627653   15356 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:42.623578   15356 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:42.624203   15356 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:42.625718   15356 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:42.626134   15356 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:42.627653   15356 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:42.631662 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:42.631689 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:42.697554 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:42.697573 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:45.228012 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:45.242540 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:45.242663 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:45.285651 1707070 cri.go:89] found id: ""
	I1124 09:28:45.285666 1707070 logs.go:282] 0 containers: []
	W1124 09:28:45.285673 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:45.285679 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:45.285747 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:45.315729 1707070 cri.go:89] found id: ""
	I1124 09:28:45.315744 1707070 logs.go:282] 0 containers: []
	W1124 09:28:45.315759 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:45.315766 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:45.315838 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:45.342027 1707070 cri.go:89] found id: ""
	I1124 09:28:45.342041 1707070 logs.go:282] 0 containers: []
	W1124 09:28:45.342048 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:45.342053 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:45.342112 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:45.368019 1707070 cri.go:89] found id: ""
	I1124 09:28:45.368033 1707070 logs.go:282] 0 containers: []
	W1124 09:28:45.368040 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:45.368046 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:45.368102 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:45.406091 1707070 cri.go:89] found id: ""
	I1124 09:28:45.406104 1707070 logs.go:282] 0 containers: []
	W1124 09:28:45.406112 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:45.406119 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:45.406176 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:45.432356 1707070 cri.go:89] found id: ""
	I1124 09:28:45.432369 1707070 logs.go:282] 0 containers: []
	W1124 09:28:45.432377 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:45.432382 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:45.432449 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:45.465291 1707070 cri.go:89] found id: ""
	I1124 09:28:45.465315 1707070 logs.go:282] 0 containers: []
	W1124 09:28:45.465324 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:45.465332 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:45.465345 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:45.527756 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:45.527784 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:45.544616 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:45.544642 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:45.606842 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:45.598345   15463 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:45.599427   15463 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:45.600949   15463 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:45.601550   15463 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:45.603105   15463 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:45.598345   15463 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:45.599427   15463 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:45.600949   15463 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:45.601550   15463 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:45.603105   15463 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:45.606853 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:45.606866 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:45.669056 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:45.669077 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:48.198708 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:48.210384 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:48.210449 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:48.235268 1707070 cri.go:89] found id: ""
	I1124 09:28:48.235282 1707070 logs.go:282] 0 containers: []
	W1124 09:28:48.235289 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:48.235295 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:48.235357 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:48.261413 1707070 cri.go:89] found id: ""
	I1124 09:28:48.261427 1707070 logs.go:282] 0 containers: []
	W1124 09:28:48.261434 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:48.261439 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:48.261496 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:48.291100 1707070 cri.go:89] found id: ""
	I1124 09:28:48.291114 1707070 logs.go:282] 0 containers: []
	W1124 09:28:48.291122 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:48.291127 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:48.291186 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:48.326388 1707070 cri.go:89] found id: ""
	I1124 09:28:48.326412 1707070 logs.go:282] 0 containers: []
	W1124 09:28:48.326420 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:48.326426 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:48.326499 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:48.356212 1707070 cri.go:89] found id: ""
	I1124 09:28:48.356227 1707070 logs.go:282] 0 containers: []
	W1124 09:28:48.356234 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:48.356240 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:48.356299 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:48.384677 1707070 cri.go:89] found id: ""
	I1124 09:28:48.384690 1707070 logs.go:282] 0 containers: []
	W1124 09:28:48.384697 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:48.384703 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:48.384759 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:48.422001 1707070 cri.go:89] found id: ""
	I1124 09:28:48.422015 1707070 logs.go:282] 0 containers: []
	W1124 09:28:48.422022 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:48.422030 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:48.422040 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:48.492980 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:48.493001 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:48.522367 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:48.522383 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:48.577847 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:48.577866 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:48.594803 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:48.594821 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:48.662402 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:48.654176   15580 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:48.655485   15580 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:48.656131   15580 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:48.657084   15580 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:48.658755   15580 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:48.654176   15580 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:48.655485   15580 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:48.656131   15580 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:48.657084   15580 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:48.658755   15580 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:51.162680 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:51.173802 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:51.173865 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:51.200124 1707070 cri.go:89] found id: ""
	I1124 09:28:51.200146 1707070 logs.go:282] 0 containers: []
	W1124 09:28:51.200155 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:51.200161 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:51.200220 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:51.225309 1707070 cri.go:89] found id: ""
	I1124 09:28:51.225323 1707070 logs.go:282] 0 containers: []
	W1124 09:28:51.225330 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:51.225335 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:51.225392 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:51.249971 1707070 cri.go:89] found id: ""
	I1124 09:28:51.249985 1707070 logs.go:282] 0 containers: []
	W1124 09:28:51.249992 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:51.249997 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:51.250053 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:51.275848 1707070 cri.go:89] found id: ""
	I1124 09:28:51.275861 1707070 logs.go:282] 0 containers: []
	W1124 09:28:51.275868 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:51.275874 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:51.275929 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:51.304356 1707070 cri.go:89] found id: ""
	I1124 09:28:51.304370 1707070 logs.go:282] 0 containers: []
	W1124 09:28:51.304386 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:51.304392 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:51.304450 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:51.329000 1707070 cri.go:89] found id: ""
	I1124 09:28:51.329015 1707070 logs.go:282] 0 containers: []
	W1124 09:28:51.329021 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:51.329027 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:51.329099 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:51.357783 1707070 cri.go:89] found id: ""
	I1124 09:28:51.357796 1707070 logs.go:282] 0 containers: []
	W1124 09:28:51.357804 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:51.357811 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:51.357820 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:51.426561 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:51.426582 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:51.456185 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:51.456202 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:51.512504 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:51.512525 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:51.530860 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:51.530877 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:51.596556 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:51.586703   15685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:51.587508   15685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:51.589233   15685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:51.589675   15685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:51.591800   15685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:51.586703   15685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:51.587508   15685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:51.589233   15685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:51.589675   15685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:51.591800   15685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:54.097448 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:54.107646 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:54.107710 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:54.131850 1707070 cri.go:89] found id: ""
	I1124 09:28:54.131869 1707070 logs.go:282] 0 containers: []
	W1124 09:28:54.131877 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:54.131883 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:54.131950 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:54.157778 1707070 cri.go:89] found id: ""
	I1124 09:28:54.157793 1707070 logs.go:282] 0 containers: []
	W1124 09:28:54.157800 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:54.157806 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:54.157871 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:54.183638 1707070 cri.go:89] found id: ""
	I1124 09:28:54.183661 1707070 logs.go:282] 0 containers: []
	W1124 09:28:54.183668 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:54.183676 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:54.183745 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:54.208654 1707070 cri.go:89] found id: ""
	I1124 09:28:54.208668 1707070 logs.go:282] 0 containers: []
	W1124 09:28:54.208675 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:54.208680 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:54.208741 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:54.237302 1707070 cri.go:89] found id: ""
	I1124 09:28:54.237317 1707070 logs.go:282] 0 containers: []
	W1124 09:28:54.237325 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:54.237331 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:54.237390 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:54.261089 1707070 cri.go:89] found id: ""
	I1124 09:28:54.261111 1707070 logs.go:282] 0 containers: []
	W1124 09:28:54.261119 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:54.261124 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:54.261195 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:54.289315 1707070 cri.go:89] found id: ""
	I1124 09:28:54.289337 1707070 logs.go:282] 0 containers: []
	W1124 09:28:54.289345 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:54.289353 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:54.289363 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:54.350840 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:54.350861 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:54.391880 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:54.391897 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:54.457044 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:54.457066 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:54.475507 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:54.475525 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:54.538358 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:54.529952   15794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:54.530805   15794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:54.531583   15794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:54.533115   15794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:54.533777   15794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:54.529952   15794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:54.530805   15794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:54.531583   15794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:54.533115   15794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:54.533777   15794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:57.040068 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:57.050642 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:57.050707 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:28:57.075811 1707070 cri.go:89] found id: ""
	I1124 09:28:57.075824 1707070 logs.go:282] 0 containers: []
	W1124 09:28:57.075832 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:28:57.075837 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:28:57.075899 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:28:57.106029 1707070 cri.go:89] found id: ""
	I1124 09:28:57.106044 1707070 logs.go:282] 0 containers: []
	W1124 09:28:57.106052 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:28:57.106058 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:28:57.106114 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:28:57.132742 1707070 cri.go:89] found id: ""
	I1124 09:28:57.132756 1707070 logs.go:282] 0 containers: []
	W1124 09:28:57.132763 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:28:57.132768 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:28:57.132825 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:28:57.156809 1707070 cri.go:89] found id: ""
	I1124 09:28:57.156823 1707070 logs.go:282] 0 containers: []
	W1124 09:28:57.156830 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:28:57.156835 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:28:57.156898 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:28:57.182649 1707070 cri.go:89] found id: ""
	I1124 09:28:57.182663 1707070 logs.go:282] 0 containers: []
	W1124 09:28:57.182670 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:28:57.182676 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:28:57.182733 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:28:57.206184 1707070 cri.go:89] found id: ""
	I1124 09:28:57.206198 1707070 logs.go:282] 0 containers: []
	W1124 09:28:57.206205 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:28:57.206211 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:28:57.206275 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:28:57.230629 1707070 cri.go:89] found id: ""
	I1124 09:28:57.230643 1707070 logs.go:282] 0 containers: []
	W1124 09:28:57.230651 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:28:57.230660 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:28:57.230670 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:28:57.287168 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:28:57.287187 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:28:57.304021 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:28:57.304037 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:28:57.368613 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:28:57.361126   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:57.361623   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:57.363259   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:57.363659   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:57.365140   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:28:57.361126   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:57.361623   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:57.363259   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:57.363659   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:28:57.365140   15882 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:28:57.368624 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:28:57.368635 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:28:57.439834 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:28:57.439854 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:28:59.971306 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:28:59.982006 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:28:59.982066 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:00.016934 1707070 cri.go:89] found id: ""
	I1124 09:29:00.016951 1707070 logs.go:282] 0 containers: []
	W1124 09:29:00.016966 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:00.016973 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:00.017049 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:00.103638 1707070 cri.go:89] found id: ""
	I1124 09:29:00.103654 1707070 logs.go:282] 0 containers: []
	W1124 09:29:00.103663 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:00.103669 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:00.103740 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:00.170246 1707070 cri.go:89] found id: ""
	I1124 09:29:00.170264 1707070 logs.go:282] 0 containers: []
	W1124 09:29:00.170273 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:00.170280 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:00.170350 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:00.236365 1707070 cri.go:89] found id: ""
	I1124 09:29:00.236382 1707070 logs.go:282] 0 containers: []
	W1124 09:29:00.236390 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:00.236397 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:00.236474 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:00.304007 1707070 cri.go:89] found id: ""
	I1124 09:29:00.304026 1707070 logs.go:282] 0 containers: []
	W1124 09:29:00.304036 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:00.304048 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:00.304139 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:00.347892 1707070 cri.go:89] found id: ""
	I1124 09:29:00.347907 1707070 logs.go:282] 0 containers: []
	W1124 09:29:00.347916 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:00.347924 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:00.348047 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:00.392276 1707070 cri.go:89] found id: ""
	I1124 09:29:00.392292 1707070 logs.go:282] 0 containers: []
	W1124 09:29:00.392304 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:00.392314 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:00.392328 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:00.445097 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:00.445118 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:00.507903 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:00.507923 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:00.532762 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:00.532787 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:00.603329 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:00.595058   16004 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:00.595595   16004 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:00.597748   16004 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:00.598425   16004 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:00.599635   16004 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:00.595058   16004 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:00.595595   16004 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:00.597748   16004 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:00.598425   16004 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:00.599635   16004 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:00.603341 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:00.603352 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:03.164630 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:03.174868 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:03.174928 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:03.198952 1707070 cri.go:89] found id: ""
	I1124 09:29:03.198966 1707070 logs.go:282] 0 containers: []
	W1124 09:29:03.198973 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:03.198979 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:03.199038 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:03.228049 1707070 cri.go:89] found id: ""
	I1124 09:29:03.228063 1707070 logs.go:282] 0 containers: []
	W1124 09:29:03.228070 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:03.228075 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:03.228133 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:03.253873 1707070 cri.go:89] found id: ""
	I1124 09:29:03.253888 1707070 logs.go:282] 0 containers: []
	W1124 09:29:03.253895 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:03.253901 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:03.253969 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:03.277874 1707070 cri.go:89] found id: ""
	I1124 09:29:03.277889 1707070 logs.go:282] 0 containers: []
	W1124 09:29:03.277903 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:03.277909 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:03.277966 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:03.306311 1707070 cri.go:89] found id: ""
	I1124 09:29:03.306333 1707070 logs.go:282] 0 containers: []
	W1124 09:29:03.306340 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:03.306345 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:03.306402 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:03.330412 1707070 cri.go:89] found id: ""
	I1124 09:29:03.330425 1707070 logs.go:282] 0 containers: []
	W1124 09:29:03.330432 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:03.330438 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:03.330572 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:03.359087 1707070 cri.go:89] found id: ""
	I1124 09:29:03.359101 1707070 logs.go:282] 0 containers: []
	W1124 09:29:03.359108 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:03.359116 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:03.359125 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:03.430996 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:03.431015 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:03.467444 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:03.467460 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:03.526316 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:03.526336 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:03.543233 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:03.543250 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:03.605146 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:03.596435   16110 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:03.597161   16110 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:03.598917   16110 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:03.599598   16110 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:03.601425   16110 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:03.596435   16110 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:03.597161   16110 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:03.598917   16110 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:03.599598   16110 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:03.601425   16110 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:06.105406 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:06.116034 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:06.116093 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:06.140111 1707070 cri.go:89] found id: ""
	I1124 09:29:06.140125 1707070 logs.go:282] 0 containers: []
	W1124 09:29:06.140132 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:06.140137 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:06.140195 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:06.164893 1707070 cri.go:89] found id: ""
	I1124 09:29:06.164907 1707070 logs.go:282] 0 containers: []
	W1124 09:29:06.164914 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:06.164920 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:06.164979 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:06.190122 1707070 cri.go:89] found id: ""
	I1124 09:29:06.190137 1707070 logs.go:282] 0 containers: []
	W1124 09:29:06.190144 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:06.190149 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:06.190206 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:06.215548 1707070 cri.go:89] found id: ""
	I1124 09:29:06.215562 1707070 logs.go:282] 0 containers: []
	W1124 09:29:06.215569 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:06.215575 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:06.215630 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:06.239566 1707070 cri.go:89] found id: ""
	I1124 09:29:06.239592 1707070 logs.go:282] 0 containers: []
	W1124 09:29:06.239600 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:06.239605 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:06.239662 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:06.266190 1707070 cri.go:89] found id: ""
	I1124 09:29:06.266223 1707070 logs.go:282] 0 containers: []
	W1124 09:29:06.266232 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:06.266237 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:06.266301 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:06.289910 1707070 cri.go:89] found id: ""
	I1124 09:29:06.289923 1707070 logs.go:282] 0 containers: []
	W1124 09:29:06.289930 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:06.289939 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:06.289955 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:06.353044 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:06.345412   16195 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:06.345855   16195 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:06.347499   16195 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:06.347885   16195 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:06.349511   16195 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:06.345412   16195 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:06.345855   16195 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:06.347499   16195 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:06.347885   16195 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:06.349511   16195 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:06.353054 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:06.353068 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:06.420094 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:06.420114 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:06.452708 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:06.452724 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:06.508689 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:06.508708 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:09.026433 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:09.036862 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:09.036926 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:09.061951 1707070 cri.go:89] found id: ""
	I1124 09:29:09.061965 1707070 logs.go:282] 0 containers: []
	W1124 09:29:09.061972 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:09.061977 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:09.062035 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:09.087954 1707070 cri.go:89] found id: ""
	I1124 09:29:09.087968 1707070 logs.go:282] 0 containers: []
	W1124 09:29:09.087976 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:09.087981 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:09.088044 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:09.112784 1707070 cri.go:89] found id: ""
	I1124 09:29:09.112798 1707070 logs.go:282] 0 containers: []
	W1124 09:29:09.112805 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:09.112810 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:09.112869 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:09.137324 1707070 cri.go:89] found id: ""
	I1124 09:29:09.137339 1707070 logs.go:282] 0 containers: []
	W1124 09:29:09.137347 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:09.137353 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:09.137413 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:09.162408 1707070 cri.go:89] found id: ""
	I1124 09:29:09.162422 1707070 logs.go:282] 0 containers: []
	W1124 09:29:09.162430 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:09.162435 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:09.162513 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:09.191279 1707070 cri.go:89] found id: ""
	I1124 09:29:09.191293 1707070 logs.go:282] 0 containers: []
	W1124 09:29:09.191300 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:09.191305 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:09.191361 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:09.214616 1707070 cri.go:89] found id: ""
	I1124 09:29:09.214630 1707070 logs.go:282] 0 containers: []
	W1124 09:29:09.214637 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:09.214645 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:09.214657 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:09.270146 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:09.270164 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:09.287320 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:09.287340 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:09.352488 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:09.344015   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:09.344642   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:09.346617   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:09.347280   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:09.348952   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:09.344015   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:09.344642   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:09.346617   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:09.347280   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:09.348952   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:09.352499 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:09.352510 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:09.418511 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:09.418532 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:11.954969 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:11.967024 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:11.967089 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:11.990717 1707070 cri.go:89] found id: ""
	I1124 09:29:11.990733 1707070 logs.go:282] 0 containers: []
	W1124 09:29:11.990741 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:11.990746 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:11.990809 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:12.020399 1707070 cri.go:89] found id: ""
	I1124 09:29:12.020413 1707070 logs.go:282] 0 containers: []
	W1124 09:29:12.020421 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:12.020427 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:12.020495 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:12.047081 1707070 cri.go:89] found id: ""
	I1124 09:29:12.047105 1707070 logs.go:282] 0 containers: []
	W1124 09:29:12.047114 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:12.047120 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:12.047185 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:12.072046 1707070 cri.go:89] found id: ""
	I1124 09:29:12.072060 1707070 logs.go:282] 0 containers: []
	W1124 09:29:12.072068 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:12.072074 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:12.072131 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:12.103533 1707070 cri.go:89] found id: ""
	I1124 09:29:12.103547 1707070 logs.go:282] 0 containers: []
	W1124 09:29:12.103554 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:12.103559 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:12.103619 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:12.131885 1707070 cri.go:89] found id: ""
	I1124 09:29:12.131900 1707070 logs.go:282] 0 containers: []
	W1124 09:29:12.131908 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:12.131914 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:12.131977 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:12.156166 1707070 cri.go:89] found id: ""
	I1124 09:29:12.156180 1707070 logs.go:282] 0 containers: []
	W1124 09:29:12.156187 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:12.156195 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:12.156206 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:12.184115 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:12.184131 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:12.239534 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:12.239553 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:12.256920 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:12.256937 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:12.322513 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:12.315053   16419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:12.315552   16419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:12.317173   16419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:12.317659   16419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:12.319113   16419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:12.315053   16419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:12.315552   16419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:12.317173   16419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:12.317659   16419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:12.319113   16419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:12.322536 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:12.322546 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:14.891198 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:14.901386 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:14.901446 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:14.926318 1707070 cri.go:89] found id: ""
	I1124 09:29:14.926340 1707070 logs.go:282] 0 containers: []
	W1124 09:29:14.926347 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:14.926353 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:14.926413 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:14.955083 1707070 cri.go:89] found id: ""
	I1124 09:29:14.955097 1707070 logs.go:282] 0 containers: []
	W1124 09:29:14.955104 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:14.955110 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:14.955167 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:14.979745 1707070 cri.go:89] found id: ""
	I1124 09:29:14.979758 1707070 logs.go:282] 0 containers: []
	W1124 09:29:14.979766 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:14.979771 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:14.979829 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:15.004845 1707070 cri.go:89] found id: ""
	I1124 09:29:15.004861 1707070 logs.go:282] 0 containers: []
	W1124 09:29:15.004869 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:15.004875 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:15.004952 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:15.044211 1707070 cri.go:89] found id: ""
	I1124 09:29:15.044225 1707070 logs.go:282] 0 containers: []
	W1124 09:29:15.044237 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:15.044243 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:15.044330 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:15.075656 1707070 cri.go:89] found id: ""
	I1124 09:29:15.075669 1707070 logs.go:282] 0 containers: []
	W1124 09:29:15.075677 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:15.075682 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:15.075740 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:15.101378 1707070 cri.go:89] found id: ""
	I1124 09:29:15.101392 1707070 logs.go:282] 0 containers: []
	W1124 09:29:15.101400 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:15.101408 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:15.101418 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:15.159297 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:15.159316 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:15.176523 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:15.176541 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:15.242899 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:15.234359   16513 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:15.235294   16513 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:15.237104   16513 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:15.237675   16513 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:15.239169   16513 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:15.234359   16513 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:15.235294   16513 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:15.237104   16513 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:15.237675   16513 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:15.239169   16513 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:15.242909 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:15.242919 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:15.304297 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:15.304319 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:17.833530 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:17.843418 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:17.843476 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:17.867779 1707070 cri.go:89] found id: ""
	I1124 09:29:17.867793 1707070 logs.go:282] 0 containers: []
	W1124 09:29:17.867806 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:17.867811 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:17.867866 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:17.891077 1707070 cri.go:89] found id: ""
	I1124 09:29:17.891090 1707070 logs.go:282] 0 containers: []
	W1124 09:29:17.891098 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:17.891103 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:17.891187 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:17.915275 1707070 cri.go:89] found id: ""
	I1124 09:29:17.915289 1707070 logs.go:282] 0 containers: []
	W1124 09:29:17.915296 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:17.915301 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:17.915357 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:17.943098 1707070 cri.go:89] found id: ""
	I1124 09:29:17.943111 1707070 logs.go:282] 0 containers: []
	W1124 09:29:17.943119 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:17.943124 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:17.943186 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:17.968417 1707070 cri.go:89] found id: ""
	I1124 09:29:17.968430 1707070 logs.go:282] 0 containers: []
	W1124 09:29:17.968437 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:17.968443 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:17.968501 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:17.993301 1707070 cri.go:89] found id: ""
	I1124 09:29:17.993315 1707070 logs.go:282] 0 containers: []
	W1124 09:29:17.993322 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:17.993328 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:17.993385 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:18.021715 1707070 cri.go:89] found id: ""
	I1124 09:29:18.021730 1707070 logs.go:282] 0 containers: []
	W1124 09:29:18.021738 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:18.021746 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:18.021756 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:18.085324 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:18.085345 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:18.118128 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:18.118159 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:18.182148 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:18.182171 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:18.199970 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:18.199990 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:18.266928 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:18.258137   16633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:18.258818   16633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:18.260418   16633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:18.261036   16633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:18.262678   16633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:18.258137   16633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:18.258818   16633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:18.260418   16633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:18.261036   16633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:18.262678   16633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:20.768145 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:20.780890 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:20.780956 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:20.807227 1707070 cri.go:89] found id: ""
	I1124 09:29:20.807241 1707070 logs.go:282] 0 containers: []
	W1124 09:29:20.807248 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:20.807253 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:20.807317 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:20.836452 1707070 cri.go:89] found id: ""
	I1124 09:29:20.836466 1707070 logs.go:282] 0 containers: []
	W1124 09:29:20.836473 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:20.836478 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:20.836535 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:20.861534 1707070 cri.go:89] found id: ""
	I1124 09:29:20.861549 1707070 logs.go:282] 0 containers: []
	W1124 09:29:20.861556 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:20.861561 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:20.861620 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:20.890181 1707070 cri.go:89] found id: ""
	I1124 09:29:20.890196 1707070 logs.go:282] 0 containers: []
	W1124 09:29:20.890203 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:20.890209 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:20.890278 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:20.919882 1707070 cri.go:89] found id: ""
	I1124 09:29:20.919897 1707070 logs.go:282] 0 containers: []
	W1124 09:29:20.919904 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:20.919910 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:20.919973 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:20.948347 1707070 cri.go:89] found id: ""
	I1124 09:29:20.948361 1707070 logs.go:282] 0 containers: []
	W1124 09:29:20.948368 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:20.948373 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:20.948428 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:20.972834 1707070 cri.go:89] found id: ""
	I1124 09:29:20.972847 1707070 logs.go:282] 0 containers: []
	W1124 09:29:20.972855 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:20.972862 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:20.972873 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:21.029330 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:21.029350 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:21.046983 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:21.047000 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:21.112004 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:21.104171   16728 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:21.104918   16728 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:21.106573   16728 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:21.107127   16728 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:21.108653   16728 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:21.104171   16728 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:21.104918   16728 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:21.106573   16728 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:21.107127   16728 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:21.108653   16728 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:21.112015 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:21.112025 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:21.174850 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:21.174870 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:23.702609 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:23.712856 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:23.712939 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:23.741964 1707070 cri.go:89] found id: ""
	I1124 09:29:23.741978 1707070 logs.go:282] 0 containers: []
	W1124 09:29:23.741985 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:23.741991 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:23.742067 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:23.766952 1707070 cri.go:89] found id: ""
	I1124 09:29:23.766966 1707070 logs.go:282] 0 containers: []
	W1124 09:29:23.766972 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:23.766978 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:23.767035 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:23.790992 1707070 cri.go:89] found id: ""
	I1124 09:29:23.791005 1707070 logs.go:282] 0 containers: []
	W1124 09:29:23.791013 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:23.791018 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:23.791073 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:23.819700 1707070 cri.go:89] found id: ""
	I1124 09:29:23.819713 1707070 logs.go:282] 0 containers: []
	W1124 09:29:23.819720 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:23.819726 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:23.819786 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:23.848657 1707070 cri.go:89] found id: ""
	I1124 09:29:23.848683 1707070 logs.go:282] 0 containers: []
	W1124 09:29:23.848690 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:23.848695 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:23.848754 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:23.873546 1707070 cri.go:89] found id: ""
	I1124 09:29:23.873571 1707070 logs.go:282] 0 containers: []
	W1124 09:29:23.873578 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:23.873584 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:23.873654 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:23.899519 1707070 cri.go:89] found id: ""
	I1124 09:29:23.899533 1707070 logs.go:282] 0 containers: []
	W1124 09:29:23.899547 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:23.899556 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:23.899568 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:23.954834 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:23.954854 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:23.971662 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:23.971680 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:24.041660 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:24.033560   16835 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:24.034352   16835 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:24.036062   16835 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:24.036417   16835 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:24.038032   16835 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:24.033560   16835 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:24.034352   16835 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:24.036062   16835 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:24.036417   16835 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:24.038032   16835 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:24.041670 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:24.041681 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:24.105146 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:24.105168 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:26.634760 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:26.646166 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:26.646251 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:26.679257 1707070 cri.go:89] found id: ""
	I1124 09:29:26.679271 1707070 logs.go:282] 0 containers: []
	W1124 09:29:26.679279 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:26.679284 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:26.679344 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:26.710754 1707070 cri.go:89] found id: ""
	I1124 09:29:26.710768 1707070 logs.go:282] 0 containers: []
	W1124 09:29:26.710775 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:26.710782 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:26.710840 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:26.735831 1707070 cri.go:89] found id: ""
	I1124 09:29:26.735845 1707070 logs.go:282] 0 containers: []
	W1124 09:29:26.735852 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:26.735857 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:26.735926 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:26.759918 1707070 cri.go:89] found id: ""
	I1124 09:29:26.759932 1707070 logs.go:282] 0 containers: []
	W1124 09:29:26.759939 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:26.759947 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:26.760002 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:26.783806 1707070 cri.go:89] found id: ""
	I1124 09:29:26.783825 1707070 logs.go:282] 0 containers: []
	W1124 09:29:26.783832 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:26.783838 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:26.783895 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:26.809230 1707070 cri.go:89] found id: ""
	I1124 09:29:26.809244 1707070 logs.go:282] 0 containers: []
	W1124 09:29:26.809252 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:26.809266 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:26.809331 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:26.836902 1707070 cri.go:89] found id: ""
	I1124 09:29:26.836916 1707070 logs.go:282] 0 containers: []
	W1124 09:29:26.836923 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:26.836931 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:26.836942 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:26.853955 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:26.853978 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:26.916186 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:26.907929   16937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:26.908672   16937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:26.910345   16937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:26.910937   16937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:26.912681   16937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:26.907929   16937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:26.908672   16937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:26.910345   16937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:26.910937   16937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:26.912681   16937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:26.916196 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:26.916218 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:26.980050 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:26.980072 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:27.010821 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:27.010838 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:29.573482 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:29.583518 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:29.583582 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:29.608188 1707070 cri.go:89] found id: ""
	I1124 09:29:29.608202 1707070 logs.go:282] 0 containers: []
	W1124 09:29:29.608209 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:29.608214 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:29.608270 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:29.641187 1707070 cri.go:89] found id: ""
	I1124 09:29:29.641201 1707070 logs.go:282] 0 containers: []
	W1124 09:29:29.641209 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:29.641214 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:29.641282 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:29.674249 1707070 cri.go:89] found id: ""
	I1124 09:29:29.674269 1707070 logs.go:282] 0 containers: []
	W1124 09:29:29.674276 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:29.674282 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:29.674339 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:29.700355 1707070 cri.go:89] found id: ""
	I1124 09:29:29.700370 1707070 logs.go:282] 0 containers: []
	W1124 09:29:29.700377 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:29.700382 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:29.700438 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:29.729232 1707070 cri.go:89] found id: ""
	I1124 09:29:29.729246 1707070 logs.go:282] 0 containers: []
	W1124 09:29:29.729253 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:29.729257 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:29.729313 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:29.756753 1707070 cri.go:89] found id: ""
	I1124 09:29:29.756766 1707070 logs.go:282] 0 containers: []
	W1124 09:29:29.756773 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:29.756788 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:29.756849 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:29.782318 1707070 cri.go:89] found id: ""
	I1124 09:29:29.782332 1707070 logs.go:282] 0 containers: []
	W1124 09:29:29.782339 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:29.782347 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:29.782358 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:29.837944 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:29.837963 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:29.855075 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:29.855094 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:29.916212 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:29.907972   17045 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:29.908745   17045 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:29.910447   17045 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:29.910972   17045 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:29.912670   17045 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:29.907972   17045 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:29.908745   17045 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:29.910447   17045 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:29.910972   17045 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:29.912670   17045 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:29.916221 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:29.916232 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:29.978681 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:29.978703 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:32.530833 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:32.541146 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:32.541251 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:32.566525 1707070 cri.go:89] found id: ""
	I1124 09:29:32.566540 1707070 logs.go:282] 0 containers: []
	W1124 09:29:32.566548 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:32.566554 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:32.566622 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:32.591741 1707070 cri.go:89] found id: ""
	I1124 09:29:32.591756 1707070 logs.go:282] 0 containers: []
	W1124 09:29:32.591763 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:32.591768 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:32.591826 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:32.617127 1707070 cri.go:89] found id: ""
	I1124 09:29:32.617141 1707070 logs.go:282] 0 containers: []
	W1124 09:29:32.617148 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:32.617153 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:32.617209 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:32.654493 1707070 cri.go:89] found id: ""
	I1124 09:29:32.654507 1707070 logs.go:282] 0 containers: []
	W1124 09:29:32.654515 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:32.654521 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:32.654580 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:32.685080 1707070 cri.go:89] found id: ""
	I1124 09:29:32.685094 1707070 logs.go:282] 0 containers: []
	W1124 09:29:32.685101 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:32.685106 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:32.685180 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:32.715751 1707070 cri.go:89] found id: ""
	I1124 09:29:32.715766 1707070 logs.go:282] 0 containers: []
	W1124 09:29:32.715782 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:32.715788 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:32.715850 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:32.742395 1707070 cri.go:89] found id: ""
	I1124 09:29:32.742409 1707070 logs.go:282] 0 containers: []
	W1124 09:29:32.742416 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:32.742424 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:32.742434 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:32.760261 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:32.760278 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:32.828736 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:32.819577   17149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:32.820328   17149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:32.822013   17149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:32.822622   17149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:32.824506   17149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:32.819577   17149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:32.820328   17149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:32.822013   17149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:32.822622   17149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:32.824506   17149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:32.828746 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:32.828759 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:32.896940 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:32.896965 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:32.928695 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:32.928711 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:35.485941 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:35.496873 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:35.496934 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:35.525748 1707070 cri.go:89] found id: ""
	I1124 09:29:35.525782 1707070 logs.go:282] 0 containers: []
	W1124 09:29:35.525791 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:35.525796 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:35.525866 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:35.553111 1707070 cri.go:89] found id: ""
	I1124 09:29:35.553126 1707070 logs.go:282] 0 containers: []
	W1124 09:29:35.553134 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:35.553142 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:35.553220 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:35.578594 1707070 cri.go:89] found id: ""
	I1124 09:29:35.578622 1707070 logs.go:282] 0 containers: []
	W1124 09:29:35.578629 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:35.578635 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:35.578706 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:35.607322 1707070 cri.go:89] found id: ""
	I1124 09:29:35.607336 1707070 logs.go:282] 0 containers: []
	W1124 09:29:35.607343 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:35.607348 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:35.607417 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:35.638865 1707070 cri.go:89] found id: ""
	I1124 09:29:35.638880 1707070 logs.go:282] 0 containers: []
	W1124 09:29:35.638887 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:35.638893 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:35.638960 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:35.672327 1707070 cri.go:89] found id: ""
	I1124 09:29:35.672352 1707070 logs.go:282] 0 containers: []
	W1124 09:29:35.672360 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:35.672365 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:35.672431 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:35.700255 1707070 cri.go:89] found id: ""
	I1124 09:29:35.700269 1707070 logs.go:282] 0 containers: []
	W1124 09:29:35.700277 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:35.700285 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:35.700297 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:35.758017 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:35.758037 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:35.775326 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:35.775344 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:35.842090 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:35.833688   17255 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:35.834400   17255 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:35.836148   17255 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:35.836802   17255 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:35.838521   17255 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:35.833688   17255 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:35.834400   17255 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:35.836148   17255 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:35.836802   17255 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:35.838521   17255 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:35.842100 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:35.842120 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:35.908742 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:35.908769 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:38.443689 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:38.453968 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:38.454035 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:38.477762 1707070 cri.go:89] found id: ""
	I1124 09:29:38.477776 1707070 logs.go:282] 0 containers: []
	W1124 09:29:38.477783 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:38.477789 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:38.477853 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:38.506120 1707070 cri.go:89] found id: ""
	I1124 09:29:38.506134 1707070 logs.go:282] 0 containers: []
	W1124 09:29:38.506141 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:38.506147 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:38.506203 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:38.530669 1707070 cri.go:89] found id: ""
	I1124 09:29:38.530691 1707070 logs.go:282] 0 containers: []
	W1124 09:29:38.530699 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:38.530705 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:38.530763 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:38.560535 1707070 cri.go:89] found id: ""
	I1124 09:29:38.560558 1707070 logs.go:282] 0 containers: []
	W1124 09:29:38.560565 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:38.560572 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:38.560631 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:38.586535 1707070 cri.go:89] found id: ""
	I1124 09:29:38.586549 1707070 logs.go:282] 0 containers: []
	W1124 09:29:38.586556 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:38.586561 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:38.586620 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:38.611101 1707070 cri.go:89] found id: ""
	I1124 09:29:38.611115 1707070 logs.go:282] 0 containers: []
	W1124 09:29:38.611122 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:38.611127 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:38.611186 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:38.643467 1707070 cri.go:89] found id: ""
	I1124 09:29:38.643482 1707070 logs.go:282] 0 containers: []
	W1124 09:29:38.643489 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:38.643497 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:38.643508 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:38.708197 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:38.708218 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:38.725978 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:38.725995 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:38.789806 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:38.781672   17359 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:38.782397   17359 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:38.783993   17359 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:38.784577   17359 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:38.786135   17359 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:38.781672   17359 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:38.782397   17359 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:38.783993   17359 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:38.784577   17359 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:38.786135   17359 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:38.789818 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:38.789828 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:38.853085 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:38.853106 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:41.387044 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:41.398117 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:41.398183 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:41.424537 1707070 cri.go:89] found id: ""
	I1124 09:29:41.424551 1707070 logs.go:282] 0 containers: []
	W1124 09:29:41.424558 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:41.424564 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:41.424626 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:41.454716 1707070 cri.go:89] found id: ""
	I1124 09:29:41.454730 1707070 logs.go:282] 0 containers: []
	W1124 09:29:41.454737 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:41.454742 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:41.454801 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:41.479954 1707070 cri.go:89] found id: ""
	I1124 09:29:41.479969 1707070 logs.go:282] 0 containers: []
	W1124 09:29:41.479976 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:41.479981 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:41.480041 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:41.505560 1707070 cri.go:89] found id: ""
	I1124 09:29:41.505575 1707070 logs.go:282] 0 containers: []
	W1124 09:29:41.505582 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:41.505593 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:41.505654 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:41.530996 1707070 cri.go:89] found id: ""
	I1124 09:29:41.531010 1707070 logs.go:282] 0 containers: []
	W1124 09:29:41.531018 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:41.531024 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:41.531090 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:41.557489 1707070 cri.go:89] found id: ""
	I1124 09:29:41.557502 1707070 logs.go:282] 0 containers: []
	W1124 09:29:41.557510 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:41.557516 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:41.557575 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:41.587178 1707070 cri.go:89] found id: ""
	I1124 09:29:41.587192 1707070 logs.go:282] 0 containers: []
	W1124 09:29:41.587199 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:41.587207 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:41.587217 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:41.644853 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:41.644873 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:41.664905 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:41.664924 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:41.731530 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:41.723947   17464 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:41.724430   17464 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:41.726128   17464 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:41.726440   17464 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:41.727892   17464 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:41.723947   17464 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:41.724430   17464 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:41.726128   17464 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:41.726440   17464 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:41.727892   17464 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:41.731540 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:41.731550 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:41.793965 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:41.793985 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:44.323959 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:44.334291 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:44.334352 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:44.364183 1707070 cri.go:89] found id: ""
	I1124 09:29:44.364199 1707070 logs.go:282] 0 containers: []
	W1124 09:29:44.364206 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:44.364212 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:44.364285 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:44.391116 1707070 cri.go:89] found id: ""
	I1124 09:29:44.391130 1707070 logs.go:282] 0 containers: []
	W1124 09:29:44.391137 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:44.391142 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:44.391199 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:44.416448 1707070 cri.go:89] found id: ""
	I1124 09:29:44.416462 1707070 logs.go:282] 0 containers: []
	W1124 09:29:44.416470 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:44.416476 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:44.416533 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:44.442027 1707070 cri.go:89] found id: ""
	I1124 09:29:44.442042 1707070 logs.go:282] 0 containers: []
	W1124 09:29:44.442059 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:44.442065 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:44.442124 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:44.467492 1707070 cri.go:89] found id: ""
	I1124 09:29:44.467516 1707070 logs.go:282] 0 containers: []
	W1124 09:29:44.467525 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:44.467531 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:44.467643 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:44.492900 1707070 cri.go:89] found id: ""
	I1124 09:29:44.492914 1707070 logs.go:282] 0 containers: []
	W1124 09:29:44.492921 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:44.492927 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:44.492986 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:44.518419 1707070 cri.go:89] found id: ""
	I1124 09:29:44.518434 1707070 logs.go:282] 0 containers: []
	W1124 09:29:44.518441 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:44.518449 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:44.518479 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:44.584407 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:44.584427 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:44.616287 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:44.616305 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:44.680013 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:44.680033 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:44.702644 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:44.702662 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:44.770803 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:44.761924   17579 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:44.762682   17579 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:44.764417   17579 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:44.765036   17579 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:44.766673   17579 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:44.761924   17579 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:44.762682   17579 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:44.764417   17579 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:44.765036   17579 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:44.766673   17579 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:47.271699 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:47.283580 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:47.283646 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:47.309341 1707070 cri.go:89] found id: ""
	I1124 09:29:47.309355 1707070 logs.go:282] 0 containers: []
	W1124 09:29:47.309368 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:47.309385 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:47.309443 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:47.335187 1707070 cri.go:89] found id: ""
	I1124 09:29:47.335202 1707070 logs.go:282] 0 containers: []
	W1124 09:29:47.335209 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:47.335214 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:47.335273 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:47.362876 1707070 cri.go:89] found id: ""
	I1124 09:29:47.362891 1707070 logs.go:282] 0 containers: []
	W1124 09:29:47.362898 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:47.362904 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:47.362964 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:47.388290 1707070 cri.go:89] found id: ""
	I1124 09:29:47.388304 1707070 logs.go:282] 0 containers: []
	W1124 09:29:47.388311 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:47.388317 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:47.388374 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:47.416544 1707070 cri.go:89] found id: ""
	I1124 09:29:47.416558 1707070 logs.go:282] 0 containers: []
	W1124 09:29:47.416565 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:47.416570 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:47.416629 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:47.441861 1707070 cri.go:89] found id: ""
	I1124 09:29:47.441875 1707070 logs.go:282] 0 containers: []
	W1124 09:29:47.441902 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:47.441909 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:47.441978 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:47.465857 1707070 cri.go:89] found id: ""
	I1124 09:29:47.465879 1707070 logs.go:282] 0 containers: []
	W1124 09:29:47.465886 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:47.465894 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:47.465905 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:47.523429 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:47.523450 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:47.540445 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:47.540462 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:47.607683 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:47.599524   17667 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:47.600165   17667 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:47.601865   17667 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:47.602402   17667 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:47.603965   17667 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:47.599524   17667 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:47.600165   17667 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:47.601865   17667 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:47.602402   17667 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:47.603965   17667 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:47.607694 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:47.607704 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:47.682000 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:47.682023 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:50.218599 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:50.229182 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:29:50.229254 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:29:50.254129 1707070 cri.go:89] found id: ""
	I1124 09:29:50.254143 1707070 logs.go:282] 0 containers: []
	W1124 09:29:50.254150 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:29:50.254155 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:29:50.254219 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:29:50.280233 1707070 cri.go:89] found id: ""
	I1124 09:29:50.280247 1707070 logs.go:282] 0 containers: []
	W1124 09:29:50.280254 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:29:50.280260 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:29:50.280317 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:29:50.304403 1707070 cri.go:89] found id: ""
	I1124 09:29:50.304417 1707070 logs.go:282] 0 containers: []
	W1124 09:29:50.304424 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:29:50.304430 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:29:50.304492 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:29:50.329881 1707070 cri.go:89] found id: ""
	I1124 09:29:50.329897 1707070 logs.go:282] 0 containers: []
	W1124 09:29:50.329904 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:29:50.329910 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:29:50.329987 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:29:50.358124 1707070 cri.go:89] found id: ""
	I1124 09:29:50.358139 1707070 logs.go:282] 0 containers: []
	W1124 09:29:50.358149 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:29:50.358158 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:29:50.358246 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:29:50.384151 1707070 cri.go:89] found id: ""
	I1124 09:29:50.384165 1707070 logs.go:282] 0 containers: []
	W1124 09:29:50.384178 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:29:50.384196 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:29:50.384254 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:29:50.408884 1707070 cri.go:89] found id: ""
	I1124 09:29:50.408899 1707070 logs.go:282] 0 containers: []
	W1124 09:29:50.408906 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:29:50.408914 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:29:50.408925 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 09:29:50.464122 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:29:50.464147 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:29:50.480720 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:29:50.480736 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:29:50.544337 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:29:50.536334   17772 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:50.536956   17772 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:50.538555   17772 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:50.539042   17772 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:50.540634   17772 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:29:50.536334   17772 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:50.536956   17772 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:50.538555   17772 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:50.539042   17772 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:29:50.540634   17772 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:29:50.544348 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:29:50.544361 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:29:50.606972 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:29:50.606993 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:29:53.143446 1707070 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:29:53.154359 1707070 kubeadm.go:602] duration metric: took 4m4.065975367s to restartPrimaryControlPlane
	W1124 09:29:53.154423 1707070 out.go:285] ! Unable to restart control-plane node(s), will reset cluster: <no value>
	I1124 09:29:53.154529 1707070 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm reset --cri-socket /run/containerd/containerd.sock --force"
	I1124 09:29:53.563147 1707070 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1124 09:29:53.576942 1707070 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1124 09:29:53.584698 1707070 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1124 09:29:53.584758 1707070 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1124 09:29:53.592605 1707070 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1124 09:29:53.592613 1707070 kubeadm.go:158] found existing configuration files:
	
	I1124 09:29:53.592678 1707070 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1124 09:29:53.600460 1707070 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1124 09:29:53.600517 1707070 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1124 09:29:53.607615 1707070 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1124 09:29:53.615236 1707070 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1124 09:29:53.615293 1707070 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1124 09:29:53.622532 1707070 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1124 09:29:53.630501 1707070 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1124 09:29:53.630562 1707070 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1124 09:29:53.638386 1707070 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1124 09:29:53.646257 1707070 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1124 09:29:53.646321 1707070 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1124 09:29:53.653836 1707070 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1124 09:29:53.692708 1707070 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-beta.0
	I1124 09:29:53.692756 1707070 kubeadm.go:319] [preflight] Running pre-flight checks
	I1124 09:29:53.765347 1707070 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1124 09:29:53.765413 1707070 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1124 09:29:53.765447 1707070 kubeadm.go:319] OS: Linux
	I1124 09:29:53.765490 1707070 kubeadm.go:319] CGROUPS_CPU: enabled
	I1124 09:29:53.765537 1707070 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1124 09:29:53.765589 1707070 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1124 09:29:53.765636 1707070 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1124 09:29:53.765682 1707070 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1124 09:29:53.765729 1707070 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1124 09:29:53.765772 1707070 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1124 09:29:53.765819 1707070 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1124 09:29:53.765864 1707070 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1124 09:29:53.828877 1707070 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1124 09:29:53.829001 1707070 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1124 09:29:53.829104 1707070 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1124 09:29:53.834791 1707070 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1124 09:29:53.838245 1707070 out.go:252]   - Generating certificates and keys ...
	I1124 09:29:53.838369 1707070 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1124 09:29:53.838434 1707070 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1124 09:29:53.838527 1707070 kubeadm.go:319] [certs] Using existing apiserver-kubelet-client certificate and key on disk
	I1124 09:29:53.838616 1707070 kubeadm.go:319] [certs] Using existing front-proxy-ca certificate authority
	I1124 09:29:53.838701 1707070 kubeadm.go:319] [certs] Using existing front-proxy-client certificate and key on disk
	I1124 09:29:53.838784 1707070 kubeadm.go:319] [certs] Using existing etcd/ca certificate authority
	I1124 09:29:53.838854 1707070 kubeadm.go:319] [certs] Using existing etcd/server certificate and key on disk
	I1124 09:29:53.838919 1707070 kubeadm.go:319] [certs] Using existing etcd/peer certificate and key on disk
	I1124 09:29:53.839002 1707070 kubeadm.go:319] [certs] Using existing etcd/healthcheck-client certificate and key on disk
	I1124 09:29:53.839386 1707070 kubeadm.go:319] [certs] Using existing apiserver-etcd-client certificate and key on disk
	I1124 09:29:53.839639 1707070 kubeadm.go:319] [certs] Using the existing "sa" key
	I1124 09:29:53.839706 1707070 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1124 09:29:54.545063 1707070 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1124 09:29:55.036514 1707070 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1124 09:29:55.148786 1707070 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1124 09:29:55.311399 1707070 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1124 09:29:55.656188 1707070 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1124 09:29:55.656996 1707070 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1124 09:29:55.659590 1707070 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1124 09:29:55.662658 1707070 out.go:252]   - Booting up control plane ...
	I1124 09:29:55.662786 1707070 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1124 09:29:55.662870 1707070 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1124 09:29:55.664747 1707070 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1124 09:29:55.686536 1707070 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1124 09:29:55.686657 1707070 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1124 09:29:55.694440 1707070 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1124 09:29:55.694885 1707070 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1124 09:29:55.694934 1707070 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1124 09:29:55.830944 1707070 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1124 09:29:55.831051 1707070 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1124 09:33:55.829210 1707070 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.000251849s
	I1124 09:33:55.829235 1707070 kubeadm.go:319] 
	I1124 09:33:55.829291 1707070 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1124 09:33:55.829323 1707070 kubeadm.go:319] 	- The kubelet is not running
	I1124 09:33:55.829428 1707070 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1124 09:33:55.829432 1707070 kubeadm.go:319] 
	I1124 09:33:55.829536 1707070 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1124 09:33:55.829573 1707070 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1124 09:33:55.829603 1707070 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1124 09:33:55.829606 1707070 kubeadm.go:319] 
	I1124 09:33:55.833661 1707070 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1124 09:33:55.834099 1707070 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1124 09:33:55.834220 1707070 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1124 09:33:55.834508 1707070 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	I1124 09:33:55.834517 1707070 kubeadm.go:319] 
	I1124 09:33:55.834670 1707070 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	W1124 09:33:55.834735 1707070 out.go:285] ! initialization failed, will try again: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000251849s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	I1124 09:33:55.834825 1707070 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm reset --cri-socket /run/containerd/containerd.sock --force"
	I1124 09:33:56.243415 1707070 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1124 09:33:56.256462 1707070 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1124 09:33:56.256517 1707070 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1124 09:33:56.264387 1707070 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1124 09:33:56.264397 1707070 kubeadm.go:158] found existing configuration files:
	
	I1124 09:33:56.264448 1707070 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1124 09:33:56.272152 1707070 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1124 09:33:56.272210 1707070 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1124 09:33:56.279938 1707070 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1124 09:33:56.287667 1707070 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1124 09:33:56.287720 1707070 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1124 09:33:56.295096 1707070 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1124 09:33:56.302699 1707070 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1124 09:33:56.302758 1707070 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1124 09:33:56.310421 1707070 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1124 09:33:56.318128 1707070 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1124 09:33:56.318183 1707070 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1124 09:33:56.325438 1707070 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1124 09:33:56.364513 1707070 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-beta.0
	I1124 09:33:56.364563 1707070 kubeadm.go:319] [preflight] Running pre-flight checks
	I1124 09:33:56.440273 1707070 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1124 09:33:56.440340 1707070 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1124 09:33:56.440376 1707070 kubeadm.go:319] OS: Linux
	I1124 09:33:56.440420 1707070 kubeadm.go:319] CGROUPS_CPU: enabled
	I1124 09:33:56.440467 1707070 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1124 09:33:56.440513 1707070 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1124 09:33:56.440560 1707070 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1124 09:33:56.440606 1707070 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1124 09:33:56.440654 1707070 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1124 09:33:56.440697 1707070 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1124 09:33:56.440749 1707070 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1124 09:33:56.440794 1707070 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1124 09:33:56.504487 1707070 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1124 09:33:56.504590 1707070 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1124 09:33:56.504685 1707070 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1124 09:33:56.510220 1707070 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1124 09:33:56.513847 1707070 out.go:252]   - Generating certificates and keys ...
	I1124 09:33:56.513936 1707070 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1124 09:33:56.514003 1707070 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1124 09:33:56.514078 1707070 kubeadm.go:319] [certs] Using existing apiserver-kubelet-client certificate and key on disk
	I1124 09:33:56.514137 1707070 kubeadm.go:319] [certs] Using existing front-proxy-ca certificate authority
	I1124 09:33:56.514205 1707070 kubeadm.go:319] [certs] Using existing front-proxy-client certificate and key on disk
	I1124 09:33:56.514264 1707070 kubeadm.go:319] [certs] Using existing etcd/ca certificate authority
	I1124 09:33:56.514326 1707070 kubeadm.go:319] [certs] Using existing etcd/server certificate and key on disk
	I1124 09:33:56.514386 1707070 kubeadm.go:319] [certs] Using existing etcd/peer certificate and key on disk
	I1124 09:33:56.514481 1707070 kubeadm.go:319] [certs] Using existing etcd/healthcheck-client certificate and key on disk
	I1124 09:33:56.514553 1707070 kubeadm.go:319] [certs] Using existing apiserver-etcd-client certificate and key on disk
	I1124 09:33:56.514589 1707070 kubeadm.go:319] [certs] Using the existing "sa" key
	I1124 09:33:56.514644 1707070 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1124 09:33:57.046366 1707070 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1124 09:33:57.432965 1707070 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1124 09:33:57.802873 1707070 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1124 09:33:58.414576 1707070 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1124 09:33:58.520825 1707070 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1124 09:33:58.522049 1707070 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1124 09:33:58.526436 1707070 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1124 09:33:58.529676 1707070 out.go:252]   - Booting up control plane ...
	I1124 09:33:58.529779 1707070 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1124 09:33:58.529855 1707070 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1124 09:33:58.529921 1707070 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1124 09:33:58.549683 1707070 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1124 09:33:58.549801 1707070 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1124 09:33:58.557327 1707070 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1124 09:33:58.557589 1707070 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1124 09:33:58.557812 1707070 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1124 09:33:58.696439 1707070 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1124 09:33:58.696553 1707070 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1124 09:37:58.697446 1707070 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.001230859s
	I1124 09:37:58.697472 1707070 kubeadm.go:319] 
	I1124 09:37:58.697558 1707070 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1124 09:37:58.697602 1707070 kubeadm.go:319] 	- The kubelet is not running
	I1124 09:37:58.697730 1707070 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1124 09:37:58.697737 1707070 kubeadm.go:319] 
	I1124 09:37:58.697847 1707070 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1124 09:37:58.697878 1707070 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1124 09:37:58.697921 1707070 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1124 09:37:58.697925 1707070 kubeadm.go:319] 
	I1124 09:37:58.701577 1707070 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1124 09:37:58.701990 1707070 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1124 09:37:58.702104 1707070 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1124 09:37:58.702344 1707070 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	I1124 09:37:58.702350 1707070 kubeadm.go:319] 
	I1124 09:37:58.702417 1707070 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	I1124 09:37:58.702481 1707070 kubeadm.go:403] duration metric: took 12m9.652556415s to StartCluster
	I1124 09:37:58.702514 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 09:37:58.702578 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 09:37:58.726968 1707070 cri.go:89] found id: ""
	I1124 09:37:58.726981 1707070 logs.go:282] 0 containers: []
	W1124 09:37:58.726988 1707070 logs.go:284] No container was found matching "kube-apiserver"
	I1124 09:37:58.726994 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 09:37:58.727055 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 09:37:58.756184 1707070 cri.go:89] found id: ""
	I1124 09:37:58.756198 1707070 logs.go:282] 0 containers: []
	W1124 09:37:58.756205 1707070 logs.go:284] No container was found matching "etcd"
	I1124 09:37:58.756210 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 09:37:58.756266 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 09:37:58.781056 1707070 cri.go:89] found id: ""
	I1124 09:37:58.781070 1707070 logs.go:282] 0 containers: []
	W1124 09:37:58.781077 1707070 logs.go:284] No container was found matching "coredns"
	I1124 09:37:58.781082 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 09:37:58.781145 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 09:37:58.805769 1707070 cri.go:89] found id: ""
	I1124 09:37:58.805783 1707070 logs.go:282] 0 containers: []
	W1124 09:37:58.805790 1707070 logs.go:284] No container was found matching "kube-scheduler"
	I1124 09:37:58.805796 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 09:37:58.805854 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 09:37:58.830758 1707070 cri.go:89] found id: ""
	I1124 09:37:58.830780 1707070 logs.go:282] 0 containers: []
	W1124 09:37:58.830791 1707070 logs.go:284] No container was found matching "kube-proxy"
	I1124 09:37:58.830797 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 09:37:58.830857 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 09:37:58.855967 1707070 cri.go:89] found id: ""
	I1124 09:37:58.855981 1707070 logs.go:282] 0 containers: []
	W1124 09:37:58.855988 1707070 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 09:37:58.855994 1707070 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 09:37:58.856051 1707070 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 09:37:58.890842 1707070 cri.go:89] found id: ""
	I1124 09:37:58.890857 1707070 logs.go:282] 0 containers: []
	W1124 09:37:58.890865 1707070 logs.go:284] No container was found matching "kindnet"
	I1124 09:37:58.890873 1707070 logs.go:123] Gathering logs for dmesg ...
	I1124 09:37:58.890885 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 09:37:58.910142 1707070 logs.go:123] Gathering logs for describe nodes ...
	I1124 09:37:58.910157 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 09:37:58.985463 1707070 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:37:58.976283   21584 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:37:58.977104   21584 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:37:58.978904   21584 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:37:58.979496   21584 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:37:58.981268   21584 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1124 09:37:58.976283   21584 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:37:58.977104   21584 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:37:58.978904   21584 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:37:58.979496   21584 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:37:58.981268   21584 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 09:37:58.985474 1707070 logs.go:123] Gathering logs for containerd ...
	I1124 09:37:58.985486 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 09:37:59.051823 1707070 logs.go:123] Gathering logs for container status ...
	I1124 09:37:59.051845 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 09:37:59.080123 1707070 logs.go:123] Gathering logs for kubelet ...
	I1124 09:37:59.080139 1707070 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	W1124 09:37:59.137954 1707070 out.go:434] Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001230859s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	W1124 09:37:59.138000 1707070 out.go:285] * 
	W1124 09:37:59.138117 1707070 out.go:285] X Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001230859s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	W1124 09:37:59.138177 1707070 out.go:285] * 
	W1124 09:37:59.140306 1707070 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1124 09:37:59.145839 1707070 out.go:203] 
	W1124 09:37:59.149636 1707070 out.go:285] X Exiting due to K8S_KUBELET_NOT_RUNNING: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001230859s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	W1124 09:37:59.149678 1707070 out.go:285] * Suggestion: Check output of 'journalctl -xeu kubelet', try passing --extra-config=kubelet.cgroup-driver=systemd to minikube start
	W1124 09:37:59.149707 1707070 out.go:285] * Related issue: https://github.com/kubernetes/minikube/issues/4172
	I1124 09:37:59.153358 1707070 out.go:203] 
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> containerd <==
	Nov 24 09:25:47 functional-291288 containerd[10324]: time="2025-11-24T09:25:47.269381066Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
	Nov 24 09:25:47 functional-291288 containerd[10324]: time="2025-11-24T09:25:47.269475385Z" level=info msg="Connect containerd service"
	Nov 24 09:25:47 functional-291288 containerd[10324]: time="2025-11-24T09:25:47.269860021Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
	Nov 24 09:25:47 functional-291288 containerd[10324]: time="2025-11-24T09:25:47.270611232Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
	Nov 24 09:25:47 functional-291288 containerd[10324]: time="2025-11-24T09:25:47.281475104Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
	Nov 24 09:25:47 functional-291288 containerd[10324]: time="2025-11-24T09:25:47.281548105Z" level=info msg=serving... address=/run/containerd/containerd.sock
	Nov 24 09:25:47 functional-291288 containerd[10324]: time="2025-11-24T09:25:47.281591536Z" level=info msg="Start subscribing containerd event"
	Nov 24 09:25:47 functional-291288 containerd[10324]: time="2025-11-24T09:25:47.281638691Z" level=info msg="Start recovering state"
	Nov 24 09:25:47 functional-291288 containerd[10324]: time="2025-11-24T09:25:47.310177719Z" level=info msg="Start event monitor"
	Nov 24 09:25:47 functional-291288 containerd[10324]: time="2025-11-24T09:25:47.310369614Z" level=info msg="Start cni network conf syncer for default"
	Nov 24 09:25:47 functional-291288 containerd[10324]: time="2025-11-24T09:25:47.310437783Z" level=info msg="Start streaming server"
	Nov 24 09:25:47 functional-291288 containerd[10324]: time="2025-11-24T09:25:47.310546157Z" level=info msg="Registered namespace \"k8s.io\" with NRI"
	Nov 24 09:25:47 functional-291288 containerd[10324]: time="2025-11-24T09:25:47.310605341Z" level=info msg="runtime interface starting up..."
	Nov 24 09:25:47 functional-291288 containerd[10324]: time="2025-11-24T09:25:47.310661563Z" level=info msg="starting plugins..."
	Nov 24 09:25:47 functional-291288 containerd[10324]: time="2025-11-24T09:25:47.310723160Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
	Nov 24 09:25:47 functional-291288 systemd[1]: Started containerd.service - containerd container runtime.
	Nov 24 09:25:47 functional-291288 containerd[10324]: time="2025-11-24T09:25:47.312804699Z" level=info msg="containerd successfully booted in 0.067611s"
	Nov 24 09:38:08 functional-291288 containerd[10324]: time="2025-11-24T09:38:08.727637230Z" level=info msg="No images store for sha256:af1a838d2702e4e84137a83a66ae93ebb59c7bf115bf022cc84ce1a55dfd3fb4"
	Nov 24 09:38:08 functional-291288 containerd[10324]: time="2025-11-24T09:38:08.730615829Z" level=info msg="ImageCreate event name:\"docker.io/kicbase/echo-server:functional-291288\""
	Nov 24 09:38:08 functional-291288 containerd[10324]: time="2025-11-24T09:38:08.741554707Z" level=info msg="ImageCreate event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Nov 24 09:38:08 functional-291288 containerd[10324]: time="2025-11-24T09:38:08.742098573Z" level=info msg="ImageUpdate event name:\"docker.io/kicbase/echo-server:functional-291288\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Nov 24 09:38:09 functional-291288 containerd[10324]: time="2025-11-24T09:38:09.725115511Z" level=info msg="RemoveImage \"kicbase/echo-server:functional-291288\""
	Nov 24 09:38:09 functional-291288 containerd[10324]: time="2025-11-24T09:38:09.727769003Z" level=info msg="ImageDelete event name:\"docker.io/kicbase/echo-server:functional-291288\""
	Nov 24 09:38:09 functional-291288 containerd[10324]: time="2025-11-24T09:38:09.729992993Z" level=info msg="ImageDelete event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\""
	Nov 24 09:38:09 functional-291288 containerd[10324]: time="2025-11-24T09:38:09.740634129Z" level=info msg="RemoveImage \"kicbase/echo-server:functional-291288\" returns successfully"
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1124 09:38:09.854353   22385 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:38:09.855078   22385 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:38:09.856256   22385 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:38:09.856777   22385 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1124 09:38:09.858347   22385 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[Nov24 08:20] overlayfs: idmapped layers are currently not supported
	[Nov24 08:21] overlayfs: idmapped layers are currently not supported
	[Nov24 08:22] overlayfs: idmapped layers are currently not supported
	[Nov24 08:23] overlayfs: idmapped layers are currently not supported
	[Nov24 08:24] overlayfs: idmapped layers are currently not supported
	[ +19.566509] overlayfs: idmapped layers are currently not supported
	[Nov24 08:25] overlayfs: idmapped layers are currently not supported
	[ +35.934095] overlayfs: idmapped layers are currently not supported
	[Nov24 08:27] overlayfs: idmapped layers are currently not supported
	[Nov24 08:28] overlayfs: idmapped layers are currently not supported
	[Nov24 08:29] overlayfs: idmapped layers are currently not supported
	[Nov24 08:30] overlayfs: idmapped layers are currently not supported
	[Nov24 08:31] overlayfs: idmapped layers are currently not supported
	[ +44.505180] overlayfs: idmapped layers are currently not supported
	[Nov24 08:32] overlayfs: idmapped layers are currently not supported
	[Nov24 08:33] overlayfs: idmapped layers are currently not supported
	[Nov24 08:34] overlayfs: idmapped layers are currently not supported
	[ +21.137877] overlayfs: idmapped layers are currently not supported
	[ +28.724175] overlayfs: idmapped layers are currently not supported
	[Nov24 08:36] overlayfs: idmapped layers are currently not supported
	[Nov24 08:37] overlayfs: idmapped layers are currently not supported
	[Nov24 08:38] overlayfs: idmapped layers are currently not supported
	[  +4.063834] overlayfs: idmapped layers are currently not supported
	[Nov24 08:40] overlayfs: idmapped layers are currently not supported
	[Nov24 08:42] kauditd_printk_skb: 8 callbacks suppressed
	
	
	==> kernel <==
	 09:38:09 up  8:20,  0 user,  load average: 0.19, 0.17, 0.32
	Linux functional-291288 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Nov 24 09:38:06 functional-291288 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Nov 24 09:38:07 functional-291288 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 331.
	Nov 24 09:38:07 functional-291288 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:38:07 functional-291288 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:38:07 functional-291288 kubelet[22123]: E1124 09:38:07.221258   22123 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Nov 24 09:38:07 functional-291288 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Nov 24 09:38:07 functional-291288 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Nov 24 09:38:07 functional-291288 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 332.
	Nov 24 09:38:07 functional-291288 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:38:07 functional-291288 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:38:07 functional-291288 kubelet[22161]: E1124 09:38:07.956843   22161 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Nov 24 09:38:07 functional-291288 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Nov 24 09:38:07 functional-291288 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Nov 24 09:38:08 functional-291288 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 333.
	Nov 24 09:38:08 functional-291288 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:38:08 functional-291288 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:38:08 functional-291288 kubelet[22251]: E1124 09:38:08.694725   22251 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Nov 24 09:38:08 functional-291288 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Nov 24 09:38:08 functional-291288 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Nov 24 09:38:09 functional-291288 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 334.
	Nov 24 09:38:09 functional-291288 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:38:09 functional-291288 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 09:38:09 functional-291288 kubelet[22279]: E1124 09:38:09.444570   22279 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Nov 24 09:38:09 functional-291288 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Nov 24 09:38:09 functional-291288 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	

-- /stdout --
helpers_test.go:262: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-291288 -n functional-291288
helpers_test.go:262: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-291288 -n functional-291288: exit status 2 (462.827571ms)

-- stdout --
	Stopped

-- /stdout --
helpers_test.go:262: status error: exit status 2 (may be ok)
helpers_test.go:264: "functional-291288" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/NodeLabels (3.24s)
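Note: the kubelet crash loop captured above is the common root cause in this report: kubelet v1.35.0-beta.0 refuses to start on a cgroup v1 host, so the apiserver stays down and every dependent test fails with connection refused. As an illustrative sketch (hypothetical helper, not part of minikube or this test suite), the host's cgroup version can be classified from the filesystem type that `stat -fc %T /sys/fs/cgroup/` reports:

```go
package main

import "fmt"

// classifyCgroup maps the filesystem type mounted at /sys/fs/cgroup/
// (as reported by `stat -fc %T /sys/fs/cgroup/`) to a cgroup hierarchy
// version. "cgroup2fs" means a pure cgroup v2 host; anything else
// (typically "tmpfs") indicates the legacy v1 or hybrid layout that
// this kubelet's configuration validation rejects.
func classifyCgroup(fstype string) string {
	if fstype == "cgroup2fs" {
		return "v2"
	}
	return "v1"
}

func main() {
	fmt.Println(classifyCgroup("cgroup2fs")) // v2
	fmt.Println(classifyCgroup("tmpfs"))     // v1
}
```

A `cgroup2fs` result would satisfy the kubelet validation; the `tmpfs` result typical of Ubuntu 20.04 (this runner's OS per the `docker info` output below) corresponds to the rejected cgroup v1 layout.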

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/RunSecondTunnel
functional_test_tunnel_test.go:154: (dbg) daemon: [out/minikube-linux-arm64 -p functional-291288 tunnel --alsologtostderr]
functional_test_tunnel_test.go:154: (dbg) daemon: [out/minikube-linux-arm64 -p functional-291288 tunnel --alsologtostderr]
functional_test_tunnel_test.go:190: tunnel command failed with unexpected error: exit code 103. stderr: I1124 09:38:14.558760 1721889 out.go:360] Setting OutFile to fd 1 ...
I1124 09:38:14.559002 1721889 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1124 09:38:14.559031 1721889 out.go:374] Setting ErrFile to fd 2...
I1124 09:38:14.559055 1721889 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1124 09:38:14.559344 1721889 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21978-1652607/.minikube/bin
I1124 09:38:14.559682 1721889 mustload.go:66] Loading cluster: functional-291288
I1124 09:38:14.561777 1721889 config.go:182] Loaded profile config "functional-291288": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
I1124 09:38:14.562608 1721889 cli_runner.go:164] Run: docker container inspect functional-291288 --format={{.State.Status}}
I1124 09:38:14.587215 1721889 host.go:66] Checking if "functional-291288" exists ...
I1124 09:38:14.587526 1721889 cli_runner.go:164] Run: docker system info --format "{{json .}}"
I1124 09:38:14.689798 1721889 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-11-24 09:38:14.680351757 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214831104 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
I1124 09:38:14.689920 1721889 api_server.go:166] Checking apiserver status ...
I1124 09:38:14.689974 1721889 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
I1124 09:38:14.690117 1721889 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
I1124 09:38:14.713529 1721889 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34684 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-291288/id_rsa Username:docker}
W1124 09:38:14.832012 1721889 api_server.go:170] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
stdout:

stderr:
I1124 09:38:14.835421 1721889 out.go:179] * The control-plane node functional-291288 apiserver is not running: (state=Stopped)
I1124 09:38:14.838334 1721889 out.go:179]   To start a cluster, run: "minikube start -p functional-291288"

stdout: * The control-plane node functional-291288 apiserver is not running: (state=Stopped)
To start a cluster, run: "minikube start -p functional-291288"
functional_test_tunnel_test.go:194: (dbg) stopping [out/minikube-linux-arm64 -p functional-291288 tunnel --alsologtostderr] ...
helpers_test.go:507: unable to find parent, assuming dead: process does not exist
functional_test_tunnel_test.go:194: read stdout failed: read |0: file already closed
functional_test_tunnel_test.go:194: (dbg) [out/minikube-linux-arm64 -p functional-291288 tunnel --alsologtostderr] stdout:
functional_test_tunnel_test.go:194: read stderr failed: read |0: file already closed
functional_test_tunnel_test.go:194: (dbg) [out/minikube-linux-arm64 -p functional-291288 tunnel --alsologtostderr] stderr:
functional_test_tunnel_test.go:194: (dbg) stopping [out/minikube-linux-arm64 -p functional-291288 tunnel --alsologtostderr] ...
helpers_test.go:525: unable to kill pid 1721888: os: process already finished
functional_test_tunnel_test.go:194: read stdout failed: read |0: file already closed
functional_test_tunnel_test.go:194: (dbg) [out/minikube-linux-arm64 -p functional-291288 tunnel --alsologtostderr] stdout:
functional_test_tunnel_test.go:194: read stderr failed: read |0: file already closed
functional_test_tunnel_test.go:194: (dbg) [out/minikube-linux-arm64 -p functional-291288 tunnel --alsologtostderr] stderr:
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/RunSecondTunnel (0.47s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/WaitService/Setup
functional_test_tunnel_test.go:212: (dbg) Run:  kubectl --context functional-291288 apply -f testdata/testsvc.yaml
functional_test_tunnel_test.go:212: (dbg) Non-zero exit: kubectl --context functional-291288 apply -f testdata/testsvc.yaml: exit status 1 (93.202487ms)

** stderr ** 
	error: error validating "testdata/testsvc.yaml": error validating data: failed to download openapi: Get "https://192.168.49.2:8441/openapi/v2?timeout=32s": dial tcp 192.168.49.2:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false

** /stderr **
functional_test_tunnel_test.go:214: kubectl --context functional-291288 apply -f testdata/testsvc.yaml failed: exit status 1
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/WaitService/Setup (0.09s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/AccessDirect
I1124 09:38:15.047253 1654467 retry.go:31] will retry after 3.270304657s: Temporary Error: Get "http:": http: no Host in request URL
functional_test_tunnel_test.go:288: failed to hit nginx at "http://": Temporary Error: Get "http:": http: no Host in request URL
functional_test_tunnel_test.go:290: (dbg) Run:  kubectl --context functional-291288 get svc nginx-svc
functional_test_tunnel_test.go:290: (dbg) Non-zero exit: kubectl --context functional-291288 get svc nginx-svc: exit status 1 (55.174214ms)

** stderr ** 
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?

** /stderr **
functional_test_tunnel_test.go:292: kubectl --context functional-291288 get svc nginx-svc failed: exit status 1
functional_test_tunnel_test.go:294: failed to kubectl get svc nginx-svc:
functional_test_tunnel_test.go:301: expected body to contain "Welcome to nginx!", but got *""*
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/AccessDirect (84.17s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmd/DeployApp
functional_test.go:1451: (dbg) Run:  kubectl --context functional-291288 create deployment hello-node --image kicbase/echo-server
functional_test.go:1451: (dbg) Non-zero exit: kubectl --context functional-291288 create deployment hello-node --image kicbase/echo-server: exit status 1 (57.154567ms)

** stderr ** 
	error: failed to create deployment: Post "https://192.168.49.2:8441/apis/apps/v1/namespaces/default/deployments?fieldManager=kubectl-create&fieldValidation=Strict": dial tcp 192.168.49.2:8441: connect: connection refused

** /stderr **
functional_test.go:1453: failed to create hello-node deployment with this command "kubectl --context functional-291288 create deployment hello-node --image kicbase/echo-server": exit status 1.
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmd/DeployApp (0.06s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmd/List
functional_test.go:1469: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 service list
functional_test.go:1469: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-291288 service list: exit status 103 (280.62376ms)

-- stdout --
	* The control-plane node functional-291288 apiserver is not running: (state=Stopped)
	  To start a cluster, run: "minikube start -p functional-291288"

-- /stdout --
functional_test.go:1471: failed to do service list. args "out/minikube-linux-arm64 -p functional-291288 service list" : exit status 103
functional_test.go:1474: expected 'service list' to contain *hello-node* but got -"* The control-plane node functional-291288 apiserver is not running: (state=Stopped)\n  To start a cluster, run: \"minikube start -p functional-291288\"\n"-
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmd/List (0.28s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmd/JSONOutput
functional_test.go:1499: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 service list -o json
functional_test.go:1499: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-291288 service list -o json: exit status 103 (277.516151ms)

-- stdout --
	* The control-plane node functional-291288 apiserver is not running: (state=Stopped)
	  To start a cluster, run: "minikube start -p functional-291288"

-- /stdout --
functional_test.go:1501: failed to list services with json format. args "out/minikube-linux-arm64 -p functional-291288 service list -o json": exit status 103
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmd/JSONOutput (0.28s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmd/HTTPS
functional_test.go:1519: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 service --namespace=default --https --url hello-node
functional_test.go:1519: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-291288 service --namespace=default --https --url hello-node: exit status 103 (274.283962ms)

-- stdout --
	* The control-plane node functional-291288 apiserver is not running: (state=Stopped)
	  To start a cluster, run: "minikube start -p functional-291288"

-- /stdout --
functional_test.go:1521: failed to get service url. args "out/minikube-linux-arm64 -p functional-291288 service --namespace=default --https --url hello-node" : exit status 103
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmd/HTTPS (0.27s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmd/Format
functional_test.go:1550: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 service hello-node --url --format={{.IP}}
functional_test.go:1550: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-291288 service hello-node --url --format={{.IP}}: exit status 103 (255.609282ms)

-- stdout --
	* The control-plane node functional-291288 apiserver is not running: (state=Stopped)
	  To start a cluster, run: "minikube start -p functional-291288"

-- /stdout --
functional_test.go:1552: failed to get service url with custom format. args "out/minikube-linux-arm64 -p functional-291288 service hello-node --url --format={{.IP}}": exit status 103
functional_test.go:1558: "* The control-plane node functional-291288 apiserver is not running: (state=Stopped)\n  To start a cluster, run: \"minikube start -p functional-291288\"" is not a valid IP
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmd/Format (0.26s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmd/URL
functional_test.go:1569: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 service hello-node --url
functional_test.go:1569: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-291288 service hello-node --url: exit status 103 (298.712669ms)

-- stdout --
	* The control-plane node functional-291288 apiserver is not running: (state=Stopped)
	  To start a cluster, run: "minikube start -p functional-291288"

-- /stdout --
functional_test.go:1571: failed to get service url. args: "out/minikube-linux-arm64 -p functional-291288 service hello-node --url": exit status 103
functional_test.go:1575: found endpoint for hello-node: * The control-plane node functional-291288 apiserver is not running: (state=Stopped)
To start a cluster, run: "minikube start -p functional-291288"
functional_test.go:1579: failed to parse "* The control-plane node functional-291288 apiserver is not running: (state=Stopped)\n  To start a cluster, run: \"minikube start -p functional-291288\"": parse "* The control-plane node functional-291288 apiserver is not running: (state=Stopped)\n  To start a cluster, run: \"minikube start -p functional-291288\"": net/url: invalid control character in URL
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmd/URL (0.30s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/MountCmd/any-port
functional_test_mount_test.go:73: (dbg) daemon: [out/minikube-linux-arm64 mount -p functional-291288 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo1536350623/001:/mount-9p --alsologtostderr -v=1]
functional_test_mount_test.go:107: wrote "test-1763977187076229275" to /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo1536350623/001/created-by-test
functional_test_mount_test.go:107: wrote "test-1763977187076229275" to /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo1536350623/001/created-by-test-removed-by-pod
functional_test_mount_test.go:107: wrote "test-1763977187076229275" to /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo1536350623/001/test-1763977187076229275
functional_test_mount_test.go:115: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:115: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-291288 ssh "findmnt -T /mount-9p | grep 9p": exit status 1 (331.485996ms)

** stderr ** 
	ssh: Process exited with status 1

** /stderr **
I1124 09:39:47.408007 1654467 retry.go:31] will retry after 487.26293ms: exit status 1
functional_test_mount_test.go:115: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:129: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 ssh -- ls -la /mount-9p
functional_test_mount_test.go:133: guest mount directory contents
total 2
-rw-r--r-- 1 docker docker 24 Nov 24 09:39 created-by-test
-rw-r--r-- 1 docker docker 24 Nov 24 09:39 created-by-test-removed-by-pod
-rw-r--r-- 1 docker docker 24 Nov 24 09:39 test-1763977187076229275
functional_test_mount_test.go:137: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 ssh cat /mount-9p/test-1763977187076229275
functional_test_mount_test.go:148: (dbg) Run:  kubectl --context functional-291288 replace --force -f testdata/busybox-mount-test.yaml
functional_test_mount_test.go:148: (dbg) Non-zero exit: kubectl --context functional-291288 replace --force -f testdata/busybox-mount-test.yaml: exit status 1 (61.861955ms)

** stderr ** 
	error: error when deleting "testdata/busybox-mount-test.yaml": Delete "https://192.168.49.2:8441/api/v1/namespaces/default/pods/busybox-mount": dial tcp 192.168.49.2:8441: connect: connection refused

** /stderr **
functional_test_mount_test.go:150: failed to 'kubectl replace' for busybox-mount-test. args "kubectl --context functional-291288 replace --force -f testdata/busybox-mount-test.yaml" : exit status 1
functional_test_mount_test.go:80: "TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/MountCmd/any-port" failed, getting debug info...
functional_test_mount_test.go:81: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 ssh "mount | grep 9p; ls -la /mount-9p; cat /mount-9p/pod-dates"
functional_test_mount_test.go:81: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-291288 ssh "mount | grep 9p; ls -la /mount-9p; cat /mount-9p/pod-dates": exit status 1 (271.355406ms)

-- stdout --
	192.168.49.1 on /mount-9p type 9p (rw,relatime,sync,dirsync,dfltuid=1000,dfltgid=997,access=any,msize=262144,trans=tcp,noextend,port=37549)
	total 2
	-rw-r--r-- 1 docker docker 24 Nov 24 09:39 created-by-test
	-rw-r--r-- 1 docker docker 24 Nov 24 09:39 created-by-test-removed-by-pod
	-rw-r--r-- 1 docker docker 24 Nov 24 09:39 test-1763977187076229275
	cat: /mount-9p/pod-dates: No such file or directory

-- /stdout --
** stderr ** 
	ssh: Process exited with status 1

** /stderr **
functional_test_mount_test.go:83: debugging command "out/minikube-linux-arm64 -p functional-291288 ssh \"mount | grep 9p; ls -la /mount-9p; cat /mount-9p/pod-dates\"" failed : exit status 1
functional_test_mount_test.go:90: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 ssh "sudo umount -f /mount-9p"
functional_test_mount_test.go:94: (dbg) stopping [out/minikube-linux-arm64 mount -p functional-291288 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo1536350623/001:/mount-9p --alsologtostderr -v=1] ...
functional_test_mount_test.go:94: (dbg) [out/minikube-linux-arm64 mount -p functional-291288 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo1536350623/001:/mount-9p --alsologtostderr -v=1] stdout:
* Mounting host path /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo1536350623/001 into VM as /mount-9p ...
- Mount type:   9p
- User ID:      docker
- Group ID:     docker
- Version:      9p2000.L
- Message Size: 262144
- Options:      map[]
- Bind Address: 192.168.49.1:37549
* Userspace file server: 
ufs starting
* Successfully mounted /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo1536350623/001 to /mount-9p

* NOTE: This process must stay alive for the mount to be accessible ...
* Unmounting /mount-9p ...

functional_test_mount_test.go:94: (dbg) [out/minikube-linux-arm64 mount -p functional-291288 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo1536350623/001:/mount-9p --alsologtostderr -v=1] stderr:
I1124 09:39:47.134084 1723900 out.go:360] Setting OutFile to fd 1 ...
I1124 09:39:47.134877 1723900 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1124 09:39:47.134907 1723900 out.go:374] Setting ErrFile to fd 2...
I1124 09:39:47.134924 1723900 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1124 09:39:47.135196 1723900 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21978-1652607/.minikube/bin
I1124 09:39:47.135473 1723900 mustload.go:66] Loading cluster: functional-291288
I1124 09:39:47.135859 1723900 config.go:182] Loaded profile config "functional-291288": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
I1124 09:39:47.136506 1723900 cli_runner.go:164] Run: docker container inspect functional-291288 --format={{.State.Status}}
I1124 09:39:47.155592 1723900 host.go:66] Checking if "functional-291288" exists ...
I1124 09:39:47.155885 1723900 cli_runner.go:164] Run: docker system info --format "{{json .}}"
I1124 09:39:47.248042 1723900 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-11-24 09:39:47.236391222 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214831104 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
I1124 09:39:47.248220 1723900 cli_runner.go:164] Run: docker network inspect functional-291288 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
I1124 09:39:47.273148 1723900 out.go:179] * Mounting host path /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo1536350623/001 into VM as /mount-9p ...
I1124 09:39:47.276188 1723900 out.go:179]   - Mount type:   9p
I1124 09:39:47.279014 1723900 out.go:179]   - User ID:      docker
I1124 09:39:47.281943 1723900 out.go:179]   - Group ID:     docker
I1124 09:39:47.284825 1723900 out.go:179]   - Version:      9p2000.L
I1124 09:39:47.287984 1723900 out.go:179]   - Message Size: 262144
I1124 09:39:47.290782 1723900 out.go:179]   - Options:      map[]
I1124 09:39:47.293546 1723900 out.go:179]   - Bind Address: 192.168.49.1:37549
I1124 09:39:47.296396 1723900 out.go:179] * Userspace file server: 
I1124 09:39:47.296724 1723900 ssh_runner.go:195] Run: /bin/bash -c "[ "x$(findmnt -T /mount-9p | grep /mount-9p)" != "x" ] && sudo umount -f -l /mount-9p || echo "
I1124 09:39:47.296831 1723900 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
I1124 09:39:47.322199 1723900 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34684 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-291288/id_rsa Username:docker}
I1124 09:39:47.429171 1723900 mount.go:180] unmount for /mount-9p ran successfully
I1124 09:39:47.429226 1723900 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /mount-9p"
I1124 09:39:47.437755 1723900 ssh_runner.go:195] Run: /bin/bash -c "sudo mount -t 9p -o dfltgid=$(grep ^docker: /etc/group | cut -d: -f3),dfltuid=$(id -u docker),msize=262144,port=37549,trans=tcp,version=9p2000.L 192.168.49.1 /mount-9p"
I1124 09:39:47.450645 1723900 main.go:127] stdlog: ufs.go:141 connected
I1124 09:39:47.450805 1723900 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:37758 Tversion tag 65535 msize 262144 version '9P2000.L'
I1124 09:39:47.450867 1723900 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:37758 Rversion tag 65535 msize 262144 version '9P2000'
I1124 09:39:47.451102 1723900 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:37758 Tattach tag 0 fid 0 afid 4294967295 uname 'nobody' nuname 0 aname ''
I1124 09:39:47.451169 1723900 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:37758 Rattach tag 0 aqid (ed771a b53bcf00 'd')
I1124 09:39:47.451439 1723900 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:37758 Tstat tag 0 fid 0
I1124 09:39:47.451489 1723900 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:37758 Rstat tag 0 st ('001' 'jenkins' 'jenkins' '' q (ed771a b53bcf00 'd') m d775 at 0 mt 1763977187 l 4096 t 0 d 0 ext )
I1124 09:39:47.454278 1723900 lock.go:50] WriteFile acquiring /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/.mount-process: {Name:mk9e9c9915765000f46c2a1c3140d595769a54d5 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
I1124 09:39:47.454564 1723900 mount.go:105] mount successful: ""
I1124 09:39:47.457999 1723900 out.go:179] * Successfully mounted /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo1536350623/001 to /mount-9p
I1124 09:39:47.460878 1723900 out.go:203] 
I1124 09:39:47.463784 1723900 out.go:179] * NOTE: This process must stay alive for the mount to be accessible ...
I1124 09:39:48.438883 1723900 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:37758 Tstat tag 0 fid 0
I1124 09:39:48.438966 1723900 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:37758 Rstat tag 0 st ('001' 'jenkins' 'jenkins' '' q (ed771a b53bcf00 'd') m d775 at 0 mt 1763977187 l 4096 t 0 d 0 ext )
I1124 09:39:48.439336 1723900 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:37758 Twalk tag 0 fid 0 newfid 1 
I1124 09:39:48.439375 1723900 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:37758 Rwalk tag 0 
I1124 09:39:48.439531 1723900 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:37758 Topen tag 0 fid 1 mode 0
I1124 09:39:48.439604 1723900 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:37758 Ropen tag 0 qid (ed771a b53bcf00 'd') iounit 0
I1124 09:39:48.439713 1723900 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:37758 Tstat tag 0 fid 0
I1124 09:39:48.439760 1723900 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:37758 Rstat tag 0 st ('001' 'jenkins' 'jenkins' '' q (ed771a b53bcf00 'd') m d775 at 0 mt 1763977187 l 4096 t 0 d 0 ext )
I1124 09:39:48.439927 1723900 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:37758 Tread tag 0 fid 1 offset 0 count 262120
I1124 09:39:48.440051 1723900 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:37758 Rread tag 0 count 258
I1124 09:39:48.440187 1723900 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:37758 Tread tag 0 fid 1 offset 258 count 261862
I1124 09:39:48.440217 1723900 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:37758 Rread tag 0 count 0
I1124 09:39:48.440349 1723900 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:37758 Tread tag 0 fid 1 offset 258 count 262120
I1124 09:39:48.440375 1723900 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:37758 Rread tag 0 count 0
I1124 09:39:48.440500 1723900 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:37758 Twalk tag 0 fid 0 newfid 2 0:'created-by-test' 
I1124 09:39:48.440532 1723900 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:37758 Rwalk tag 0 (ed771b b53bcf00 '') 
I1124 09:39:48.440659 1723900 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:37758 Tstat tag 0 fid 2
I1124 09:39:48.440705 1723900 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:37758 Rstat tag 0 st ('created-by-test' 'jenkins' 'jenkins' '' q (ed771b b53bcf00 '') m 644 at 0 mt 1763977187 l 24 t 0 d 0 ext )
I1124 09:39:48.440837 1723900 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:37758 Tstat tag 0 fid 2
I1124 09:39:48.440874 1723900 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:37758 Rstat tag 0 st ('created-by-test' 'jenkins' 'jenkins' '' q (ed771b b53bcf00 '') m 644 at 0 mt 1763977187 l 24 t 0 d 0 ext )
I1124 09:39:48.441002 1723900 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:37758 Tclunk tag 0 fid 2
I1124 09:39:48.441025 1723900 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:37758 Rclunk tag 0
I1124 09:39:48.441148 1723900 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:37758 Twalk tag 0 fid 0 newfid 2 0:'test-1763977187076229275' 
I1124 09:39:48.441180 1723900 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:37758 Rwalk tag 0 (ed771d b53bcf00 '') 
I1124 09:39:48.441306 1723900 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:37758 Tstat tag 0 fid 2
I1124 09:39:48.441339 1723900 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:37758 Rstat tag 0 st ('test-1763977187076229275' 'jenkins' 'jenkins' '' q (ed771d b53bcf00 '') m 644 at 0 mt 1763977187 l 24 t 0 d 0 ext )
I1124 09:39:48.441464 1723900 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:37758 Tstat tag 0 fid 2
I1124 09:39:48.441495 1723900 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:37758 Rstat tag 0 st ('test-1763977187076229275' 'jenkins' 'jenkins' '' q (ed771d b53bcf00 '') m 644 at 0 mt 1763977187 l 24 t 0 d 0 ext )
I1124 09:39:48.441617 1723900 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:37758 Tclunk tag 0 fid 2
I1124 09:39:48.441637 1723900 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:37758 Rclunk tag 0
I1124 09:39:48.441761 1723900 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:37758 Twalk tag 0 fid 0 newfid 2 0:'created-by-test-removed-by-pod' 
I1124 09:39:48.441837 1723900 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:37758 Rwalk tag 0 (ed771c b53bcf00 '') 
I1124 09:39:48.441976 1723900 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:37758 Tstat tag 0 fid 2
I1124 09:39:48.442034 1723900 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:37758 Rstat tag 0 st ('created-by-test-removed-by-pod' 'jenkins' 'jenkins' '' q (ed771c b53bcf00 '') m 644 at 0 mt 1763977187 l 24 t 0 d 0 ext )
I1124 09:39:48.442156 1723900 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:37758 Tstat tag 0 fid 2
I1124 09:39:48.442197 1723900 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:37758 Rstat tag 0 st ('created-by-test-removed-by-pod' 'jenkins' 'jenkins' '' q (ed771c b53bcf00 '') m 644 at 0 mt 1763977187 l 24 t 0 d 0 ext )
I1124 09:39:48.442357 1723900 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:37758 Tclunk tag 0 fid 2
I1124 09:39:48.442383 1723900 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:37758 Rclunk tag 0
I1124 09:39:48.442534 1723900 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:37758 Tread tag 0 fid 1 offset 258 count 262120
I1124 09:39:48.442568 1723900 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:37758 Rread tag 0 count 0
I1124 09:39:48.442708 1723900 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:37758 Tclunk tag 0 fid 1
I1124 09:39:48.442738 1723900 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:37758 Rclunk tag 0
I1124 09:39:48.722871 1723900 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:37758 Twalk tag 0 fid 0 newfid 1 0:'test-1763977187076229275' 
I1124 09:39:48.722969 1723900 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:37758 Rwalk tag 0 (ed771d b53bcf00 '') 
I1124 09:39:48.723142 1723900 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:37758 Tstat tag 0 fid 1
I1124 09:39:48.723186 1723900 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:37758 Rstat tag 0 st ('test-1763977187076229275' 'jenkins' 'jenkins' '' q (ed771d b53bcf00 '') m 644 at 0 mt 1763977187 l 24 t 0 d 0 ext )
I1124 09:39:48.723345 1723900 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:37758 Twalk tag 0 fid 1 newfid 2 
I1124 09:39:48.723411 1723900 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:37758 Rwalk tag 0 
I1124 09:39:48.723523 1723900 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:37758 Topen tag 0 fid 2 mode 0
I1124 09:39:48.723583 1723900 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:37758 Ropen tag 0 qid (ed771d b53bcf00 '') iounit 0
I1124 09:39:48.723724 1723900 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:37758 Tstat tag 0 fid 1
I1124 09:39:48.723774 1723900 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:37758 Rstat tag 0 st ('test-1763977187076229275' 'jenkins' 'jenkins' '' q (ed771d b53bcf00 '') m 644 at 0 mt 1763977187 l 24 t 0 d 0 ext )
I1124 09:39:48.723912 1723900 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:37758 Tread tag 0 fid 2 offset 0 count 262120
I1124 09:39:48.723961 1723900 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:37758 Rread tag 0 count 24
I1124 09:39:48.724115 1723900 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:37758 Tread tag 0 fid 2 offset 24 count 262120
I1124 09:39:48.724165 1723900 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:37758 Rread tag 0 count 0
I1124 09:39:48.724295 1723900 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:37758 Tread tag 0 fid 2 offset 24 count 262120
I1124 09:39:48.724336 1723900 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:37758 Rread tag 0 count 0
I1124 09:39:48.724480 1723900 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:37758 Tclunk tag 0 fid 2
I1124 09:39:48.724525 1723900 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:37758 Rclunk tag 0
I1124 09:39:48.724751 1723900 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:37758 Tclunk tag 0 fid 1
I1124 09:39:48.724782 1723900 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:37758 Rclunk tag 0
I1124 09:39:49.061147 1723900 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:37758 Tstat tag 0 fid 0
I1124 09:39:49.061247 1723900 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:37758 Rstat tag 0 st ('001' 'jenkins' 'jenkins' '' q (ed771a b53bcf00 'd') m d775 at 0 mt 1763977187 l 4096 t 0 d 0 ext )
I1124 09:39:49.061591 1723900 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:37758 Twalk tag 0 fid 0 newfid 1 
I1124 09:39:49.061627 1723900 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:37758 Rwalk tag 0 
I1124 09:39:49.061741 1723900 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:37758 Topen tag 0 fid 1 mode 0
I1124 09:39:49.061791 1723900 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:37758 Ropen tag 0 qid (ed771a b53bcf00 'd') iounit 0
I1124 09:39:49.061932 1723900 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:37758 Tstat tag 0 fid 0
I1124 09:39:49.061967 1723900 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:37758 Rstat tag 0 st ('001' 'jenkins' 'jenkins' '' q (ed771a b53bcf00 'd') m d775 at 0 mt 1763977187 l 4096 t 0 d 0 ext )
I1124 09:39:49.062106 1723900 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:37758 Tread tag 0 fid 1 offset 0 count 262120
I1124 09:39:49.062206 1723900 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:37758 Rread tag 0 count 258
I1124 09:39:49.062353 1723900 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:37758 Tread tag 0 fid 1 offset 258 count 261862
I1124 09:39:49.062383 1723900 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:37758 Rread tag 0 count 0
I1124 09:39:49.062507 1723900 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:37758 Tread tag 0 fid 1 offset 258 count 262120
I1124 09:39:49.062535 1723900 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:37758 Rread tag 0 count 0
I1124 09:39:49.062673 1723900 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:37758 Twalk tag 0 fid 0 newfid 2 0:'created-by-test' 
I1124 09:39:49.062705 1723900 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:37758 Rwalk tag 0 (ed771b b53bcf00 '') 
I1124 09:39:49.062817 1723900 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:37758 Tstat tag 0 fid 2
I1124 09:39:49.062848 1723900 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:37758 Rstat tag 0 st ('created-by-test' 'jenkins' 'jenkins' '' q (ed771b b53bcf00 '') m 644 at 0 mt 1763977187 l 24 t 0 d 0 ext )
I1124 09:39:49.062981 1723900 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:37758 Tstat tag 0 fid 2
I1124 09:39:49.063016 1723900 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:37758 Rstat tag 0 st ('created-by-test' 'jenkins' 'jenkins' '' q (ed771b b53bcf00 '') m 644 at 0 mt 1763977187 l 24 t 0 d 0 ext )
I1124 09:39:49.063140 1723900 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:37758 Tclunk tag 0 fid 2
I1124 09:39:49.063163 1723900 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:37758 Rclunk tag 0
I1124 09:39:49.063297 1723900 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:37758 Twalk tag 0 fid 0 newfid 2 0:'test-1763977187076229275' 
I1124 09:39:49.063327 1723900 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:37758 Rwalk tag 0 (ed771d b53bcf00 '') 
I1124 09:39:49.063435 1723900 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:37758 Tstat tag 0 fid 2
I1124 09:39:49.063467 1723900 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:37758 Rstat tag 0 st ('test-1763977187076229275' 'jenkins' 'jenkins' '' q (ed771d b53bcf00 '') m 644 at 0 mt 1763977187 l 24 t 0 d 0 ext )
I1124 09:39:49.063603 1723900 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:37758 Tstat tag 0 fid 2
I1124 09:39:49.063633 1723900 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:37758 Rstat tag 0 st ('test-1763977187076229275' 'jenkins' 'jenkins' '' q (ed771d b53bcf00 '') m 644 at 0 mt 1763977187 l 24 t 0 d 0 ext )
I1124 09:39:49.063744 1723900 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:37758 Tclunk tag 0 fid 2
I1124 09:39:49.063774 1723900 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:37758 Rclunk tag 0
I1124 09:39:49.063905 1723900 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:37758 Twalk tag 0 fid 0 newfid 2 0:'created-by-test-removed-by-pod' 
I1124 09:39:49.063960 1723900 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:37758 Rwalk tag 0 (ed771c b53bcf00 '') 
I1124 09:39:49.064072 1723900 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:37758 Tstat tag 0 fid 2
I1124 09:39:49.064103 1723900 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:37758 Rstat tag 0 st ('created-by-test-removed-by-pod' 'jenkins' 'jenkins' '' q (ed771c b53bcf00 '') m 644 at 0 mt 1763977187 l 24 t 0 d 0 ext )
I1124 09:39:49.064237 1723900 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:37758 Tstat tag 0 fid 2
I1124 09:39:49.064267 1723900 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:37758 Rstat tag 0 st ('created-by-test-removed-by-pod' 'jenkins' 'jenkins' '' q (ed771c b53bcf00 '') m 644 at 0 mt 1763977187 l 24 t 0 d 0 ext )
I1124 09:39:49.064387 1723900 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:37758 Tclunk tag 0 fid 2
I1124 09:39:49.064409 1723900 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:37758 Rclunk tag 0
I1124 09:39:49.064516 1723900 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:37758 Tread tag 0 fid 1 offset 258 count 262120
I1124 09:39:49.064544 1723900 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:37758 Rread tag 0 count 0
I1124 09:39:49.064678 1723900 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:37758 Tclunk tag 0 fid 1
I1124 09:39:49.064705 1723900 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:37758 Rclunk tag 0
I1124 09:39:49.065932 1723900 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:37758 Twalk tag 0 fid 0 newfid 1 0:'pod-dates' 
I1124 09:39:49.065999 1723900 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:37758 Rerror tag 0 ename 'file not found' ecode 0
I1124 09:39:49.361082 1723900 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:37758 Tclunk tag 0 fid 0
I1124 09:39:49.361133 1723900 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:37758 Rclunk tag 0
I1124 09:39:49.362130 1723900 main.go:127] stdlog: ufs.go:147 disconnected
I1124 09:39:49.384945 1723900 out.go:179] * Unmounting /mount-9p ...
I1124 09:39:49.388088 1723900 ssh_runner.go:195] Run: /bin/bash -c "[ "x$(findmnt -T /mount-9p | grep /mount-9p)" != "x" ] && sudo umount -f -l /mount-9p || echo "
I1124 09:39:49.395349 1723900 mount.go:180] unmount for /mount-9p ran successfully
I1124 09:39:49.395572 1723900 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/.mount-process: {Name:mk9e9c9915765000f46c2a1c3140d595769a54d5 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
I1124 09:39:49.399229 1723900 out.go:203] 
W1124 09:39:49.402323 1723900 out.go:285] X Exiting due to MK_INTERRUPTED: Received terminated signal
X Exiting due to MK_INTERRUPTED: Received terminated signal
I1124 09:39:49.405348 1723900 out.go:203] 
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/MountCmd/any-port (2.41s)
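For reference, the 9p mount invocation recorded in the trace above can be assembled as below. This is a sketch that only prints the command rather than executing it (mounting requires root and a live userspace 9p server); the port 37549, server address 192.168.49.1, and msize 262144 are values from this particular run, not constants, and the `dfltuid`/`dfltgid` options the real run derives from the guest's `docker` user and group are omitted for brevity.

```shell
#!/usr/bin/env bash
# Values taken from this test run's log; they differ per run.
PORT=37549            # userspace 9p server port on the host
SERVER=192.168.49.1   # host address as seen from the minikube container
MSIZE=262144          # 9p message size seen in the Tversion/Rversion exchange
MOUNT_POINT=/mount-9p

# Build the mount command minikube runs over SSH inside the guest.
CMD="sudo mount -t 9p -o msize=${MSIZE},port=${PORT},trans=tcp,version=9p2000.L ${SERVER} ${MOUNT_POINT}"
echo "$CMD"
```

The corresponding teardown in the log is the guarded `findmnt -T /mount-9p … && sudo umount -f -l /mount-9p`, which lazily unmounts only if the path is actually mounted.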
TestKubernetesUpgrade (801.64s)
=== RUN   TestKubernetesUpgrade
=== PAUSE TestKubernetesUpgrade
=== CONT  TestKubernetesUpgrade
version_upgrade_test.go:222: (dbg) Run:  out/minikube-linux-arm64 start -p kubernetes-upgrade-188777 --memory=3072 --kubernetes-version=v1.28.0 --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd
version_upgrade_test.go:222: (dbg) Done: out/minikube-linux-arm64 start -p kubernetes-upgrade-188777 --memory=3072 --kubernetes-version=v1.28.0 --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd: (42.033103288s)
version_upgrade_test.go:227: (dbg) Run:  out/minikube-linux-arm64 stop -p kubernetes-upgrade-188777
version_upgrade_test.go:227: (dbg) Done: out/minikube-linux-arm64 stop -p kubernetes-upgrade-188777: (1.789116581s)
version_upgrade_test.go:232: (dbg) Run:  out/minikube-linux-arm64 -p kubernetes-upgrade-188777 status --format={{.Host}}
version_upgrade_test.go:232: (dbg) Non-zero exit: out/minikube-linux-arm64 -p kubernetes-upgrade-188777 status --format={{.Host}}: exit status 7 (110.563693ms)
-- stdout --
	Stopped
-- /stdout --
version_upgrade_test.go:234: status error: exit status 7 (may be ok)
version_upgrade_test.go:243: (dbg) Run:  out/minikube-linux-arm64 start -p kubernetes-upgrade-188777 --memory=3072 --kubernetes-version=v1.35.0-beta.0 --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd
E1124 10:11:03.604431 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/addons-674149/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1124 10:11:24.717228 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-941011/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
version_upgrade_test.go:243: (dbg) Non-zero exit: out/minikube-linux-arm64 start -p kubernetes-upgrade-188777 --memory=3072 --kubernetes-version=v1.35.0-beta.0 --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd: exit status 109 (12m32.559645006s)
-- stdout --
	* [kubernetes-upgrade-188777] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	  - MINIKUBE_LOCATION=21978
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/21978-1652607/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/21978-1652607/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-arm64
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the docker driver based on existing profile
	* Starting "kubernetes-upgrade-188777" primary control-plane node in "kubernetes-upgrade-188777" cluster
	* Pulling base image v0.0.48-1763789673-21948 ...
	* Preparing Kubernetes v1.35.0-beta.0 on containerd 2.1.5 ...
	
	
-- /stdout --
** stderr ** 
	I1124 10:10:58.771060 1856079 out.go:360] Setting OutFile to fd 1 ...
	I1124 10:10:58.771189 1856079 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1124 10:10:58.771201 1856079 out.go:374] Setting ErrFile to fd 2...
	I1124 10:10:58.771207 1856079 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1124 10:10:58.771479 1856079 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21978-1652607/.minikube/bin
	I1124 10:10:58.771850 1856079 out.go:368] Setting JSON to false
	I1124 10:10:58.773120 1856079 start.go:133] hostinfo: {"hostname":"ip-172-31-30-239","uptime":31988,"bootTime":1763947071,"procs":175,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"92f46a7d-c249-4c12-924a-77f64874c910"}
	I1124 10:10:58.773199 1856079 start.go:143] virtualization:  
	I1124 10:10:58.781519 1856079 out.go:179] * [kubernetes-upgrade-188777] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1124 10:10:58.784842 1856079 out.go:179]   - MINIKUBE_LOCATION=21978
	I1124 10:10:58.784934 1856079 notify.go:221] Checking for updates...
	I1124 10:10:58.789011 1856079 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1124 10:10:58.792201 1856079 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/21978-1652607/kubeconfig
	I1124 10:10:58.795152 1856079 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/21978-1652607/.minikube
	I1124 10:10:58.798106 1856079 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1124 10:10:58.801506 1856079 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1124 10:10:58.804884 1856079 config.go:182] Loaded profile config "kubernetes-upgrade-188777": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.28.0
	I1124 10:10:58.805477 1856079 driver.go:422] Setting default libvirt URI to qemu:///system
	I1124 10:10:58.843517 1856079 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1124 10:10:58.843626 1856079 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1124 10:10:58.954253 1856079 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:2 ContainersRunning:1 ContainersPaused:0 ContainersStopped:1 Images:4 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:38 OomKillDisable:true NGoroutines:53 SystemTime:2025-11-24 10:10:58.944865784 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214831104 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1124 10:10:58.954357 1856079 docker.go:319] overlay module found
	I1124 10:10:58.960004 1856079 out.go:179] * Using the docker driver based on existing profile
	I1124 10:10:58.965369 1856079 start.go:309] selected driver: docker
	I1124 10:10:58.965392 1856079 start.go:927] validating driver "docker" against &{Name:kubernetes-upgrade-188777 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:3072 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.28.0 ClusterName:kubernetes-upgrade-188777 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.76.2 Port:8443 KubernetesVersion:v1.28.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1124 10:10:58.965504 1856079 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1124 10:10:58.966197 1856079 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1124 10:10:59.056869 1856079 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:2 ContainersRunning:1 ContainersPaused:0 ContainersStopped:1 Images:4 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:38 OomKillDisable:true NGoroutines:53 SystemTime:2025-11-24 10:10:59.042165312 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214831104 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1124 10:10:59.057215 1856079 cni.go:84] Creating CNI manager for ""
	I1124 10:10:59.057278 1856079 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1124 10:10:59.057334 1856079 start.go:353] cluster config:
	{Name:kubernetes-upgrade-188777 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:3072 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:kubernetes-upgrade-188777 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain
:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.76.2 Port:8443 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: Stat
icIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1124 10:10:59.060909 1856079 out.go:179] * Starting "kubernetes-upgrade-188777" primary control-plane node in "kubernetes-upgrade-188777" cluster
	I1124 10:10:59.064212 1856079 cache.go:134] Beginning downloading kic base image for docker with containerd
	I1124 10:10:59.067976 1856079 out.go:179] * Pulling base image v0.0.48-1763789673-21948 ...
	I1124 10:10:59.072140 1856079 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
	I1124 10:10:59.072351 1856079 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f in local docker daemon
	I1124 10:10:59.097730 1856079 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f in local docker daemon, skipping pull
	I1124 10:10:59.097751 1856079 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f exists in daemon, skipping load
	W1124 10:10:59.132347 1856079 preload.go:144] https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.35.0-beta.0/preloaded-images-k8s-v18-v1.35.0-beta.0-containerd-overlay2-arm64.tar.lz4 status code: 404
	W1124 10:10:59.324163 1856079 preload.go:144] https://github.com/kubernetes-sigs/minikube-preloads/releases/download/v18/preloaded-images-k8s-v18-v1.35.0-beta.0-containerd-overlay2-arm64.tar.lz4 status code: 404
	I1124 10:10:59.324321 1856079 profile.go:143] Saving config to /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/kubernetes-upgrade-188777/config.json ...
	I1124 10:10:59.324457 1856079 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 10:10:59.324568 1856079 cache.go:243] Successfully downloaded all kic artifacts
	I1124 10:10:59.324598 1856079 start.go:360] acquireMachinesLock for kubernetes-upgrade-188777: {Name:mk9a269c1841b86daa5a018142194d78f478982d Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 10:10:59.324643 1856079 start.go:364] duration metric: took 26.503µs to acquireMachinesLock for "kubernetes-upgrade-188777"
	I1124 10:10:59.324661 1856079 start.go:96] Skipping create...Using existing machine configuration
	I1124 10:10:59.324667 1856079 fix.go:54] fixHost starting: 
	I1124 10:10:59.324927 1856079 cli_runner.go:164] Run: docker container inspect kubernetes-upgrade-188777 --format={{.State.Status}}
	I1124 10:10:59.341942 1856079 fix.go:112] recreateIfNeeded on kubernetes-upgrade-188777: state=Stopped err=<nil>
	W1124 10:10:59.341974 1856079 fix.go:138] unexpected machine state, will restart: <nil>
	I1124 10:10:59.349753 1856079 out.go:252] * Restarting existing docker container for "kubernetes-upgrade-188777" ...
	I1124 10:10:59.349854 1856079 cli_runner.go:164] Run: docker start kubernetes-upgrade-188777
	I1124 10:10:59.739468 1856079 cli_runner.go:164] Run: docker container inspect kubernetes-upgrade-188777 --format={{.State.Status}}
	I1124 10:10:59.776929 1856079 kic.go:430] container "kubernetes-upgrade-188777" state is running.
	I1124 10:10:59.777332 1856079 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" kubernetes-upgrade-188777
	I1124 10:10:59.803567 1856079 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 10:10:59.804085 1856079 profile.go:143] Saving config to /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/kubernetes-upgrade-188777/config.json ...
	I1124 10:10:59.804303 1856079 machine.go:94] provisionDockerMachine start ...
	I1124 10:10:59.804366 1856079 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" kubernetes-upgrade-188777
	I1124 10:10:59.858645 1856079 main.go:143] libmachine: Using SSH client type: native
	I1124 10:10:59.859137 1856079 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 34914 <nil> <nil>}
	I1124 10:10:59.859152 1856079 main.go:143] libmachine: About to run SSH command:
	hostname
	I1124 10:10:59.859817 1856079 main.go:143] libmachine: Error dialing TCP: ssh: handshake failed: read tcp 127.0.0.1:41790->127.0.0.1:34914: read: connection reset by peer
	I1124 10:11:00.060588 1856079 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 10:11:00.410860 1856079 cache.go:107] acquiring lock: {Name:mk22a10f0ce1f3295b61e7e76c455d0494a3e278 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 10:11:00.410964 1856079 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 exists
	I1124 10:11:00.410974 1856079 cache.go:96] cache image "gcr.io/k8s-minikube/storage-provisioner:v5" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5" took 132.334µs
	I1124 10:11:00.410983 1856079 cache.go:80] save to tar file gcr.io/k8s-minikube/storage-provisioner:v5 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 succeeded
	I1124 10:11:00.410995 1856079 cache.go:107] acquiring lock: {Name:mk1cf42e67442503a46c578224bd3cb68bf682d4 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 10:11:00.411028 1856079 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 exists
	I1124 10:11:00.411033 1856079 cache.go:96] cache image "registry.k8s.io/kube-apiserver:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0" took 39.771µs
	I1124 10:11:00.411039 1856079 cache.go:80] save to tar file registry.k8s.io/kube-apiserver:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 succeeded
	I1124 10:11:00.411050 1856079 cache.go:107] acquiring lock: {Name:mkfdc49c8e68aee34cee0c9d441ae8a4dca675c6 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 10:11:00.411077 1856079 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 exists
	I1124 10:11:00.411083 1856079 cache.go:96] cache image "registry.k8s.io/kube-controller-manager:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0" took 33.978µs
	I1124 10:11:00.411097 1856079 cache.go:80] save to tar file registry.k8s.io/kube-controller-manager:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 succeeded
	I1124 10:11:00.411121 1856079 cache.go:107] acquiring lock: {Name:mkdbf38e05e2c47c1a7a906a2236e9e7020a94c5 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 10:11:00.411152 1856079 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 exists
	I1124 10:11:00.411167 1856079 cache.go:96] cache image "registry.k8s.io/kube-scheduler:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0" took 59.922µs
	I1124 10:11:00.411174 1856079 cache.go:80] save to tar file registry.k8s.io/kube-scheduler:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 succeeded
	I1124 10:11:00.411184 1856079 cache.go:107] acquiring lock: {Name:mk80fdbe7cdb5bc17c2a82b4ecfd00214559a435 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 10:11:00.411216 1856079 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 exists
	I1124 10:11:00.411221 1856079 cache.go:96] cache image "registry.k8s.io/kube-proxy:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0" took 38.433µs
	I1124 10:11:00.411227 1856079 cache.go:80] save to tar file registry.k8s.io/kube-proxy:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 succeeded
	I1124 10:11:00.411237 1856079 cache.go:107] acquiring lock: {Name:mk85f1502dbb97830776608fb729eb3605e112e6 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 10:11:00.411267 1856079 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 exists
	I1124 10:11:00.411273 1856079 cache.go:96] cache image "registry.k8s.io/pause:3.10.1" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1" took 36.464µs
	I1124 10:11:00.411284 1856079 cache.go:80] save to tar file registry.k8s.io/pause:3.10.1 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 succeeded
	I1124 10:11:00.411294 1856079 cache.go:107] acquiring lock: {Name:mk46ce3b59d7e062b3dbc8a90fe5b4231f256471 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 10:11:00.411321 1856079 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.5.24-0 exists
	I1124 10:11:00.411326 1856079 cache.go:96] cache image "registry.k8s.io/etcd:3.5.24-0" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.5.24-0" took 32.821µs
	I1124 10:11:00.411332 1856079 cache.go:80] save to tar file registry.k8s.io/etcd:3.5.24-0 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.5.24-0 succeeded
	I1124 10:11:00.411341 1856079 cache.go:107] acquiring lock: {Name:mk726502cb84c177b2e14fee88512325761511c5 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 10:11:00.411367 1856079 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 exists
	I1124 10:11:00.411372 1856079 cache.go:96] cache image "registry.k8s.io/coredns/coredns:v1.13.1" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1" took 33.198µs
	I1124 10:11:00.411379 1856079 cache.go:80] save to tar file registry.k8s.io/coredns/coredns:v1.13.1 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 succeeded
	I1124 10:11:00.411388 1856079 cache.go:87] Successfully saved all images to host disk.
	I1124 10:11:03.039062 1856079 main.go:143] libmachine: SSH cmd err, output: <nil>: kubernetes-upgrade-188777
	
	I1124 10:11:03.039091 1856079 ubuntu.go:182] provisioning hostname "kubernetes-upgrade-188777"
	I1124 10:11:03.039167 1856079 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" kubernetes-upgrade-188777
	I1124 10:11:03.061532 1856079 main.go:143] libmachine: Using SSH client type: native
	I1124 10:11:03.061855 1856079 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 34914 <nil> <nil>}
	I1124 10:11:03.061874 1856079 main.go:143] libmachine: About to run SSH command:
	sudo hostname kubernetes-upgrade-188777 && echo "kubernetes-upgrade-188777" | sudo tee /etc/hostname
	I1124 10:11:03.243010 1856079 main.go:143] libmachine: SSH cmd err, output: <nil>: kubernetes-upgrade-188777
	
	I1124 10:11:03.243172 1856079 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" kubernetes-upgrade-188777
	I1124 10:11:03.268742 1856079 main.go:143] libmachine: Using SSH client type: native
	I1124 10:11:03.269067 1856079 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 34914 <nil> <nil>}
	I1124 10:11:03.269089 1856079 main.go:143] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\skubernetes-upgrade-188777' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 kubernetes-upgrade-188777/g' /etc/hosts;
				else 
					echo '127.0.1.1 kubernetes-upgrade-188777' | sudo tee -a /etc/hosts; 
				fi
			fi
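The `/etc/hosts` rewrite the provisioner runs above can be exercised locally against a temporary file instead of the real `/etc/hosts`. This is a minimal sketch of the same grep/sed logic under assumed GNU grep/sed semantics (`\s` support); the starting file contents are illustrative:

```shell
# Sketch of minikube's 127.0.1.1 hostname rewrite, run against a temp
# copy of /etc/hosts (the pre-existing "old-name" entry is made up).
HOSTS=$(mktemp)
NAME=kubernetes-upgrade-188777
printf '127.0.0.1 localhost\n127.0.1.1 old-name\n' > "$HOSTS"
if ! grep -q "\s$NAME" "$HOSTS"; then
    if grep -q '^127.0.1.1\s' "$HOSTS"; then
        # An existing 127.0.1.1 line is rewritten in place ...
        sed -i "s/^127.0.1.1\s.*/127.0.1.1 $NAME/" "$HOSTS"
    else
        # ... otherwise a new entry is appended.
        echo "127.0.1.1 $NAME" >> "$HOSTS"
    fi
fi
cat "$HOSTS"
```

Either branch leaves exactly one `127.0.1.1` entry carrying the machine name, which is what the subsequent `sudo hostname` command relies on for local resolution.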
	I1124 10:11:03.427006 1856079 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	I1124 10:11:03.427034 1856079 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/21978-1652607/.minikube CaCertPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/21978-1652607/.minikube}
	I1124 10:11:03.427101 1856079 ubuntu.go:190] setting up certificates
	I1124 10:11:03.427122 1856079 provision.go:84] configureAuth start
	I1124 10:11:03.427218 1856079 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" kubernetes-upgrade-188777
	I1124 10:11:03.456420 1856079 provision.go:143] copyHostCerts
	I1124 10:11:03.456507 1856079 exec_runner.go:144] found /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.pem, removing ...
	I1124 10:11:03.456529 1856079 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.pem
	I1124 10:11:03.456609 1856079 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.pem (1078 bytes)
	I1124 10:11:03.456719 1856079 exec_runner.go:144] found /home/jenkins/minikube-integration/21978-1652607/.minikube/cert.pem, removing ...
	I1124 10:11:03.456730 1856079 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21978-1652607/.minikube/cert.pem
	I1124 10:11:03.456764 1856079 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/21978-1652607/.minikube/cert.pem (1123 bytes)
	I1124 10:11:03.456831 1856079 exec_runner.go:144] found /home/jenkins/minikube-integration/21978-1652607/.minikube/key.pem, removing ...
	I1124 10:11:03.456840 1856079 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21978-1652607/.minikube/key.pem
	I1124 10:11:03.456866 1856079 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/21978-1652607/.minikube/key.pem (1679 bytes)
	I1124 10:11:03.456930 1856079 provision.go:117] generating server cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca-key.pem org=jenkins.kubernetes-upgrade-188777 san=[127.0.0.1 192.168.76.2 kubernetes-upgrade-188777 localhost minikube]
	I1124 10:11:03.823876 1856079 provision.go:177] copyRemoteCerts
	I1124 10:11:03.823954 1856079 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1124 10:11:03.824002 1856079 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" kubernetes-upgrade-188777
	I1124 10:11:03.842948 1856079 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34914 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/kubernetes-upgrade-188777/id_rsa Username:docker}
	I1124 10:11:03.954096 1856079 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server.pem --> /etc/docker/server.pem (1241 bytes)
	I1124 10:11:03.985983 1856079 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I1124 10:11:04.011983 1856079 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I1124 10:11:04.034370 1856079 provision.go:87] duration metric: took 607.221049ms to configureAuth
	I1124 10:11:04.034414 1856079 ubuntu.go:206] setting minikube options for container-runtime
	I1124 10:11:04.034621 1856079 config.go:182] Loaded profile config "kubernetes-upgrade-188777": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1124 10:11:04.034634 1856079 machine.go:97] duration metric: took 4.230323342s to provisionDockerMachine
	I1124 10:11:04.034642 1856079 start.go:293] postStartSetup for "kubernetes-upgrade-188777" (driver="docker")
	I1124 10:11:04.034654 1856079 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1124 10:11:04.034710 1856079 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1124 10:11:04.034749 1856079 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" kubernetes-upgrade-188777
	I1124 10:11:04.056781 1856079 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34914 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/kubernetes-upgrade-188777/id_rsa Username:docker}
	I1124 10:11:04.164613 1856079 ssh_runner.go:195] Run: cat /etc/os-release
	I1124 10:11:04.169889 1856079 main.go:143] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1124 10:11:04.169971 1856079 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1124 10:11:04.169998 1856079 filesync.go:126] Scanning /home/jenkins/minikube-integration/21978-1652607/.minikube/addons for local assets ...
	I1124 10:11:04.170080 1856079 filesync.go:126] Scanning /home/jenkins/minikube-integration/21978-1652607/.minikube/files for local assets ...
	I1124 10:11:04.170197 1856079 filesync.go:149] local asset: /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/ssl/certs/16544672.pem -> 16544672.pem in /etc/ssl/certs
	I1124 10:11:04.170345 1856079 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I1124 10:11:04.179324 1856079 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/ssl/certs/16544672.pem --> /etc/ssl/certs/16544672.pem (1708 bytes)
	I1124 10:11:04.201653 1856079 start.go:296] duration metric: took 166.986024ms for postStartSetup
	I1124 10:11:04.201806 1856079 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1124 10:11:04.201870 1856079 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" kubernetes-upgrade-188777
	I1124 10:11:04.223272 1856079 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34914 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/kubernetes-upgrade-188777/id_rsa Username:docker}
	I1124 10:11:04.328930 1856079 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
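The two disk probes above parse `df` output with awk, taking the second row (the mounted filesystem) and a single column. The same pattern can be tried locally; `/` stands in for the `/var` path used in the log, and this assumes GNU coreutils `df` (for `-BG`):

```shell
# Grab the available space (in whole GiB, e.g. "12G") for the filesystem
# backing /, mirroring the `df -BG /var | awk 'NR==2{print $4}'` probe.
AVAIL=$(df -BG / | awk 'NR==2{print $4}')
echo "$AVAIL"
```

`NR==2` skips the header row `df` always prints, and `$4` is the "Avail" column in the default output layout.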
	I1124 10:11:04.335068 1856079 fix.go:56] duration metric: took 5.010393483s for fixHost
	I1124 10:11:04.335099 1856079 start.go:83] releasing machines lock for "kubernetes-upgrade-188777", held for 5.010447153s
	I1124 10:11:04.335191 1856079 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" kubernetes-upgrade-188777
	I1124 10:11:04.357532 1856079 ssh_runner.go:195] Run: cat /version.json
	I1124 10:11:04.357588 1856079 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" kubernetes-upgrade-188777
	I1124 10:11:04.357841 1856079 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1124 10:11:04.357888 1856079 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" kubernetes-upgrade-188777
	I1124 10:11:04.383191 1856079 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34914 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/kubernetes-upgrade-188777/id_rsa Username:docker}
	I1124 10:11:04.398269 1856079 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34914 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/kubernetes-upgrade-188777/id_rsa Username:docker}
	I1124 10:11:04.498513 1856079 ssh_runner.go:195] Run: systemctl --version
	I1124 10:11:04.623280 1856079 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W1124 10:11:04.628578 1856079 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1124 10:11:04.628656 1856079 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1124 10:11:04.639996 1856079 cni.go:259] no active bridge cni configs found in "/etc/cni/net.d" - nothing to disable
	I1124 10:11:04.640023 1856079 start.go:496] detecting cgroup driver to use...
	I1124 10:11:04.640058 1856079 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1124 10:11:04.640129 1856079 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I1124 10:11:04.657440 1856079 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I1124 10:11:04.672515 1856079 docker.go:218] disabling cri-docker service (if available) ...
	I1124 10:11:04.672583 1856079 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1124 10:11:04.690040 1856079 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1124 10:11:04.704224 1856079 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1124 10:11:04.841872 1856079 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1124 10:11:04.995366 1856079 docker.go:234] disabling docker service ...
	I1124 10:11:04.995479 1856079 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1124 10:11:05.015271 1856079 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1124 10:11:05.029930 1856079 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1124 10:11:05.179920 1856079 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1124 10:11:05.336981 1856079 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1124 10:11:05.352187 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I1124 10:11:05.367475 1856079 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 10:11:05.531337 1856079 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10.1"|' /etc/containerd/config.toml"
	I1124 10:11:05.541770 1856079 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I1124 10:11:05.551197 1856079 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I1124 10:11:05.551297 1856079 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I1124 10:11:05.560610 1856079 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1124 10:11:05.570017 1856079 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I1124 10:11:05.579403 1856079 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1124 10:11:05.588604 1856079 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1124 10:11:05.597390 1856079 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I1124 10:11:05.606944 1856079 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I1124 10:11:05.616481 1856079 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
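The series of `sed` edits above patches `/etc/containerd/config.toml` in place; the cgroup-driver rewrite in particular can be reproduced against a throwaway file. The config fragment below is illustrative, not the full containerd config, and GNU `sed -r` is assumed:

```shell
# Sketch of the SystemdCgroup rewrite: flip the flag to false while
# preserving the line's original indentation via the captured group.
CFG=$(mktemp)
cat > "$CFG" <<'EOF'
[plugins."io.containerd.grpc.v1.cri".containerd.runtimes.runc.options]
  SystemdCgroup = true
EOF
sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' "$CFG"
cat "$CFG"
```

Capturing the leading spaces with `( *)` and re-emitting them as `\1` keeps the TOML nesting intact, which matters because containerd's parser is whitespace-tolerant but human readers of the file are not.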
	I1124 10:11:05.625986 1856079 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1124 10:11:05.634401 1856079 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1124 10:11:05.642753 1856079 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1124 10:11:05.789079 1856079 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I1124 10:11:06.033498 1856079 start.go:543] Will wait 60s for socket path /run/containerd/containerd.sock
	I1124 10:11:06.033629 1856079 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I1124 10:11:06.038205 1856079 start.go:564] Will wait 60s for crictl version
	I1124 10:11:06.038343 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:11:06.042736 1856079 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1124 10:11:06.071349 1856079 start.go:580] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v2.1.5
	RuntimeApiVersion:  v1
	I1124 10:11:06.071467 1856079 ssh_runner.go:195] Run: containerd --version
	I1124 10:11:06.094358 1856079 ssh_runner.go:195] Run: containerd --version
	I1124 10:11:06.131527 1856079 out.go:179] * Preparing Kubernetes v1.35.0-beta.0 on containerd 2.1.5 ...
	I1124 10:11:06.134662 1856079 cli_runner.go:164] Run: docker network inspect kubernetes-upgrade-188777 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1124 10:11:06.158174 1856079 ssh_runner.go:195] Run: grep 192.168.76.1	host.minikube.internal$ /etc/hosts
	I1124 10:11:06.164144 1856079 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.168.76.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I1124 10:11:06.178658 1856079 kubeadm.go:884] updating cluster {Name:kubernetes-upgrade-188777 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:3072 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:kubernetes-upgrade-188777 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.76.2 Port:8443 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1124 10:11:06.178856 1856079 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 10:11:06.403271 1856079 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 10:11:06.594381 1856079 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 10:11:06.777678 1856079 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
	I1124 10:11:06.777773 1856079 ssh_runner.go:195] Run: sudo crictl images --output json
	I1124 10:11:06.812530 1856079 containerd.go:623] couldn't find preloaded image for "registry.k8s.io/kube-apiserver:v1.35.0-beta.0". assuming images are not preloaded.
	I1124 10:11:06.812557 1856079 cache_images.go:90] LoadCachedImages start: [registry.k8s.io/kube-apiserver:v1.35.0-beta.0 registry.k8s.io/kube-controller-manager:v1.35.0-beta.0 registry.k8s.io/kube-scheduler:v1.35.0-beta.0 registry.k8s.io/kube-proxy:v1.35.0-beta.0 registry.k8s.io/pause:3.10.1 registry.k8s.io/etcd:3.5.24-0 registry.k8s.io/coredns/coredns:v1.13.1 gcr.io/k8s-minikube/storage-provisioner:v5]
	I1124 10:11:06.812625 1856079 image.go:138] retrieving image: gcr.io/k8s-minikube/storage-provisioner:v5
	I1124 10:11:06.812907 1856079 image.go:138] retrieving image: registry.k8s.io/kube-apiserver:v1.35.0-beta.0
	I1124 10:11:06.812995 1856079 image.go:138] retrieving image: registry.k8s.io/kube-controller-manager:v1.35.0-beta.0
	I1124 10:11:06.813137 1856079 image.go:138] retrieving image: registry.k8s.io/kube-scheduler:v1.35.0-beta.0
	I1124 10:11:06.813242 1856079 image.go:138] retrieving image: registry.k8s.io/kube-proxy:v1.35.0-beta.0
	I1124 10:11:06.813340 1856079 image.go:138] retrieving image: registry.k8s.io/pause:3.10.1
	I1124 10:11:06.813458 1856079 image.go:138] retrieving image: registry.k8s.io/etcd:3.5.24-0
	I1124 10:11:06.813552 1856079 image.go:138] retrieving image: registry.k8s.io/coredns/coredns:v1.13.1
	I1124 10:11:06.817505 1856079 image.go:181] daemon lookup for registry.k8s.io/coredns/coredns:v1.13.1: Error response from daemon: No such image: registry.k8s.io/coredns/coredns:v1.13.1
	I1124 10:11:06.817972 1856079 image.go:181] daemon lookup for registry.k8s.io/kube-controller-manager:v1.35.0-beta.0: Error response from daemon: No such image: registry.k8s.io/kube-controller-manager:v1.35.0-beta.0
	I1124 10:11:06.818155 1856079 image.go:181] daemon lookup for registry.k8s.io/kube-proxy:v1.35.0-beta.0: Error response from daemon: No such image: registry.k8s.io/kube-proxy:v1.35.0-beta.0
	I1124 10:11:06.818272 1856079 image.go:181] daemon lookup for registry.k8s.io/kube-scheduler:v1.35.0-beta.0: Error response from daemon: No such image: registry.k8s.io/kube-scheduler:v1.35.0-beta.0
	I1124 10:11:06.818409 1856079 image.go:181] daemon lookup for gcr.io/k8s-minikube/storage-provisioner:v5: Error response from daemon: No such image: gcr.io/k8s-minikube/storage-provisioner:v5
	I1124 10:11:06.818659 1856079 image.go:181] daemon lookup for registry.k8s.io/pause:3.10.1: Error response from daemon: No such image: registry.k8s.io/pause:3.10.1
	I1124 10:11:06.818770 1856079 image.go:181] daemon lookup for registry.k8s.io/etcd:3.5.24-0: Error response from daemon: No such image: registry.k8s.io/etcd:3.5.24-0
	I1124 10:11:06.818836 1856079 image.go:181] daemon lookup for registry.k8s.io/kube-apiserver:v1.35.0-beta.0: Error response from daemon: No such image: registry.k8s.io/kube-apiserver:v1.35.0-beta.0
	I1124 10:11:07.135127 1856079 containerd.go:267] Checking existence of image with name "registry.k8s.io/coredns/coredns:v1.13.1" and sha "e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf"
	I1124 10:11:07.135218 1856079 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images ls name==registry.k8s.io/coredns/coredns:v1.13.1
	I1124 10:11:07.155487 1856079 containerd.go:267] Checking existence of image with name "registry.k8s.io/kube-proxy:v1.35.0-beta.0" and sha "404c2e12861777b763b8feaa316d36680fc68ad308a8d2f6e55f1bb981cdd904"
	I1124 10:11:07.155633 1856079 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images ls name==registry.k8s.io/kube-proxy:v1.35.0-beta.0
	I1124 10:11:07.180000 1856079 containerd.go:267] Checking existence of image with name "registry.k8s.io/pause:3.10.1" and sha "d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd"
	I1124 10:11:07.180119 1856079 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images ls name==registry.k8s.io/pause:3.10.1
	I1124 10:11:07.186722 1856079 containerd.go:267] Checking existence of image with name "registry.k8s.io/etcd:3.5.24-0" and sha "1211402d28f5813ed906916bfcdd0a7404c2f9048ef5bb54387a6745bc410eca"
	I1124 10:11:07.186836 1856079 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images ls name==registry.k8s.io/etcd:3.5.24-0
	I1124 10:11:07.213959 1856079 containerd.go:267] Checking existence of image with name "registry.k8s.io/kube-controller-manager:v1.35.0-beta.0" and sha "68b5f775f18769fcb77bd8474c80bda2050163b6c66f4551f352b7381b8ca5be"
	I1124 10:11:07.214084 1856079 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images ls name==registry.k8s.io/kube-controller-manager:v1.35.0-beta.0
	I1124 10:11:07.220217 1856079 cache_images.go:118] "registry.k8s.io/coredns/coredns:v1.13.1" needs transfer: "registry.k8s.io/coredns/coredns:v1.13.1" does not exist at hash "e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf" in container runtime
	I1124 10:11:07.220300 1856079 cri.go:218] Removing image: registry.k8s.io/coredns/coredns:v1.13.1
	I1124 10:11:07.220379 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:11:07.220465 1856079 cache_images.go:118] "registry.k8s.io/kube-proxy:v1.35.0-beta.0" needs transfer: "registry.k8s.io/kube-proxy:v1.35.0-beta.0" does not exist at hash "404c2e12861777b763b8feaa316d36680fc68ad308a8d2f6e55f1bb981cdd904" in container runtime
	I1124 10:11:07.220515 1856079 cri.go:218] Removing image: registry.k8s.io/kube-proxy:v1.35.0-beta.0
	I1124 10:11:07.220556 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:11:07.231137 1856079 containerd.go:267] Checking existence of image with name "registry.k8s.io/kube-scheduler:v1.35.0-beta.0" and sha "16378741539f1be9c6e347d127537d379a6592587b09b4eb47964cb5c43a409b"
	I1124 10:11:07.231252 1856079 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images ls name==registry.k8s.io/kube-scheduler:v1.35.0-beta.0
	I1124 10:11:07.253299 1856079 containerd.go:267] Checking existence of image with name "registry.k8s.io/kube-apiserver:v1.35.0-beta.0" and sha "ccd634d9bcc36ac6235e9c86676cd3a02c06afc3788a25f1bbf39ca7d44585f4"
	I1124 10:11:07.253417 1856079 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images ls name==registry.k8s.io/kube-apiserver:v1.35.0-beta.0
	I1124 10:11:07.256003 1856079 cache_images.go:118] "registry.k8s.io/pause:3.10.1" needs transfer: "registry.k8s.io/pause:3.10.1" does not exist at hash "d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd" in container runtime
	I1124 10:11:07.256087 1856079 cri.go:218] Removing image: registry.k8s.io/pause:3.10.1
	I1124 10:11:07.256169 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:11:07.275331 1856079 cache_images.go:118] "registry.k8s.io/etcd:3.5.24-0" needs transfer: "registry.k8s.io/etcd:3.5.24-0" does not exist at hash "1211402d28f5813ed906916bfcdd0a7404c2f9048ef5bb54387a6745bc410eca" in container runtime
	I1124 10:11:07.275436 1856079 cri.go:218] Removing image: registry.k8s.io/etcd:3.5.24-0
	I1124 10:11:07.275514 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:11:07.318868 1856079 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-proxy:v1.35.0-beta.0
	I1124 10:11:07.318965 1856079 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/coredns/coredns:v1.13.1
	I1124 10:11:07.319011 1856079 cache_images.go:118] "registry.k8s.io/kube-scheduler:v1.35.0-beta.0" needs transfer: "registry.k8s.io/kube-scheduler:v1.35.0-beta.0" does not exist at hash "16378741539f1be9c6e347d127537d379a6592587b09b4eb47964cb5c43a409b" in container runtime
	I1124 10:11:07.319229 1856079 cri.go:218] Removing image: registry.k8s.io/kube-scheduler:v1.35.0-beta.0
	I1124 10:11:07.319280 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:11:07.321355 1856079 cache_images.go:118] "registry.k8s.io/kube-controller-manager:v1.35.0-beta.0" needs transfer: "registry.k8s.io/kube-controller-manager:v1.35.0-beta.0" does not exist at hash "68b5f775f18769fcb77bd8474c80bda2050163b6c66f4551f352b7381b8ca5be" in container runtime
	I1124 10:11:07.321445 1856079 cri.go:218] Removing image: registry.k8s.io/kube-controller-manager:v1.35.0-beta.0
	I1124 10:11:07.321520 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:11:07.327840 1856079 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/pause:3.10.1
	I1124 10:11:07.327957 1856079 cache_images.go:118] "registry.k8s.io/kube-apiserver:v1.35.0-beta.0" needs transfer: "registry.k8s.io/kube-apiserver:v1.35.0-beta.0" does not exist at hash "ccd634d9bcc36ac6235e9c86676cd3a02c06afc3788a25f1bbf39ca7d44585f4" in container runtime
	I1124 10:11:07.328023 1856079 cri.go:218] Removing image: registry.k8s.io/kube-apiserver:v1.35.0-beta.0
	I1124 10:11:07.328078 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:11:07.328160 1856079 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/etcd:3.5.24-0
	I1124 10:11:07.423822 1856079 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-scheduler:v1.35.0-beta.0
	I1124 10:11:07.423985 1856079 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/coredns/coredns:v1.13.1
	I1124 10:11:07.424020 1856079 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-controller-manager:v1.35.0-beta.0
	I1124 10:11:07.424046 1856079 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-proxy:v1.35.0-beta.0
	I1124 10:11:07.503618 1856079 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/etcd:3.5.24-0
	I1124 10:11:07.503765 1856079 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/pause:3.10.1
	I1124 10:11:07.503843 1856079 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-apiserver:v1.35.0-beta.0
	I1124 10:11:07.591732 1856079 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-proxy:v1.35.0-beta.0
	I1124 10:11:07.591874 1856079 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-scheduler:v1.35.0-beta.0
	I1124 10:11:07.591961 1856079 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/coredns/coredns:v1.13.1
	I1124 10:11:07.592052 1856079 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-controller-manager:v1.35.0-beta.0
	I1124 10:11:07.714053 1856079 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-apiserver:v1.35.0-beta.0
	I1124 10:11:07.714166 1856079 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/etcd:3.5.24-0
	I1124 10:11:07.714225 1856079 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/pause:3.10.1
	I1124 10:11:07.807786 1856079 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-controller-manager:v1.35.0-beta.0
	I1124 10:11:07.807886 1856079 cache_images.go:291] Loading image from: /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0
	I1124 10:11:07.808148 1856079 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/images/kube-proxy_v1.35.0-beta.0
	I1124 10:11:07.807949 1856079 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-scheduler:v1.35.0-beta.0
	I1124 10:11:07.807974 1856079 cache_images.go:291] Loading image from: /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1
	I1124 10:11:07.808332 1856079 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/images/coredns_v1.13.1
	I1124 10:11:07.894778 1856079 cache_images.go:291] Loading image from: /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1
	I1124 10:11:07.895055 1856079 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/images/pause_3.10.1
	I1124 10:11:07.894942 1856079 cache_images.go:291] Loading image from: /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.5.24-0
	I1124 10:11:07.895225 1856079 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/images/etcd_3.5.24-0
	I1124 10:11:07.894988 1856079 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-apiserver:v1.35.0-beta.0
	I1124 10:11:07.947711 1856079 ssh_runner.go:352] existence check for /var/lib/minikube/images/coredns_v1.13.1: stat -c "%s %y" /var/lib/minikube/images/coredns_v1.13.1: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/images/coredns_v1.13.1': No such file or directory
	I1124 10:11:07.947788 1856079 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 --> /var/lib/minikube/images/coredns_v1.13.1 (21178368 bytes)
	I1124 10:11:07.947904 1856079 cache_images.go:291] Loading image from: /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0
	I1124 10:11:07.948009 1856079 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/images/kube-controller-manager_v1.35.0-beta.0
	I1124 10:11:07.948082 1856079 ssh_runner.go:352] existence check for /var/lib/minikube/images/kube-proxy_v1.35.0-beta.0: stat -c "%s %y" /var/lib/minikube/images/kube-proxy_v1.35.0-beta.0: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/images/kube-proxy_v1.35.0-beta.0': No such file or directory
	I1124 10:11:07.948122 1856079 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 --> /var/lib/minikube/images/kube-proxy_v1.35.0-beta.0 (22432256 bytes)
	I1124 10:11:07.948194 1856079 cache_images.go:291] Loading image from: /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0
	I1124 10:11:07.948272 1856079 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/images/kube-scheduler_v1.35.0-beta.0
	I1124 10:11:08.032015 1856079 ssh_runner.go:352] existence check for /var/lib/minikube/images/pause_3.10.1: stat -c "%s %y" /var/lib/minikube/images/pause_3.10.1: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/images/pause_3.10.1': No such file or directory
	I1124 10:11:08.032053 1856079 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 --> /var/lib/minikube/images/pause_3.10.1 (268288 bytes)
	I1124 10:11:08.032114 1856079 cache_images.go:291] Loading image from: /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0
	I1124 10:11:08.032193 1856079 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/images/kube-apiserver_v1.35.0-beta.0
	I1124 10:11:08.032242 1856079 ssh_runner.go:352] existence check for /var/lib/minikube/images/etcd_3.5.24-0: stat -c "%s %y" /var/lib/minikube/images/etcd_3.5.24-0: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/images/etcd_3.5.24-0': No such file or directory
	I1124 10:11:08.032255 1856079 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.5.24-0 --> /var/lib/minikube/images/etcd_3.5.24-0 (21895168 bytes)
	I1124 10:11:08.032314 1856079 ssh_runner.go:352] existence check for /var/lib/minikube/images/kube-scheduler_v1.35.0-beta.0: stat -c "%s %y" /var/lib/minikube/images/kube-scheduler_v1.35.0-beta.0: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/images/kube-scheduler_v1.35.0-beta.0': No such file or directory
	I1124 10:11:08.032323 1856079 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 --> /var/lib/minikube/images/kube-scheduler_v1.35.0-beta.0 (15401984 bytes)
	I1124 10:11:08.032361 1856079 ssh_runner.go:352] existence check for /var/lib/minikube/images/kube-controller-manager_v1.35.0-beta.0: stat -c "%s %y" /var/lib/minikube/images/kube-controller-manager_v1.35.0-beta.0: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/images/kube-controller-manager_v1.35.0-beta.0': No such file or directory
	I1124 10:11:08.032371 1856079 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 --> /var/lib/minikube/images/kube-controller-manager_v1.35.0-beta.0 (20672000 bytes)
	I1124 10:11:08.132247 1856079 ssh_runner.go:352] existence check for /var/lib/minikube/images/kube-apiserver_v1.35.0-beta.0: stat -c "%s %y" /var/lib/minikube/images/kube-apiserver_v1.35.0-beta.0: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/images/kube-apiserver_v1.35.0-beta.0': No such file or directory
	I1124 10:11:08.132342 1856079 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 --> /var/lib/minikube/images/kube-apiserver_v1.35.0-beta.0 (24689152 bytes)
	W1124 10:11:08.201228 1856079 image.go:286] image gcr.io/k8s-minikube/storage-provisioner:v5 arch mismatch: want arm64 got amd64. fixing
	I1124 10:11:08.201421 1856079 containerd.go:267] Checking existence of image with name "gcr.io/k8s-minikube/storage-provisioner:v5" and sha "66749159455b3f08c8318fe0233122f54d0f5889f9c5fdfb73c3fd9d99895b51"
	I1124 10:11:08.201507 1856079 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images ls name==gcr.io/k8s-minikube/storage-provisioner:v5
	I1124 10:11:08.237242 1856079 containerd.go:285] Loading image: /var/lib/minikube/images/pause_3.10.1
	I1124 10:11:08.237316 1856079 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images import /var/lib/minikube/images/pause_3.10.1
	I1124 10:11:08.435929 1856079 cache_images.go:118] "gcr.io/k8s-minikube/storage-provisioner:v5" needs transfer: "gcr.io/k8s-minikube/storage-provisioner:v5" does not exist at hash "66749159455b3f08c8318fe0233122f54d0f5889f9c5fdfb73c3fd9d99895b51" in container runtime
	I1124 10:11:08.436093 1856079 cri.go:218] Removing image: gcr.io/k8s-minikube/storage-provisioner:v5
	I1124 10:11:08.436273 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:11:08.642726 1856079 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi gcr.io/k8s-minikube/storage-provisioner:v5
	I1124 10:11:08.642896 1856079 cache_images.go:323] Transferred and loaded /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 from cache
	I1124 10:11:08.808017 1856079 containerd.go:285] Loading image: /var/lib/minikube/images/kube-scheduler_v1.35.0-beta.0
	I1124 10:11:08.808138 1856079 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images import /var/lib/minikube/images/kube-scheduler_v1.35.0-beta.0
	I1124 10:11:08.842387 1856079 cache_images.go:291] Loading image from: /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5
	I1124 10:11:08.842577 1856079 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/images/storage-provisioner_v5
	I1124 10:11:10.565350 1856079 ssh_runner.go:235] Completed: sudo ctr -n=k8s.io images import /var/lib/minikube/images/kube-scheduler_v1.35.0-beta.0: (1.757063629s)
	I1124 10:11:10.565375 1856079 cache_images.go:323] Transferred and loaded /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 from cache
	I1124 10:11:10.565392 1856079 containerd.go:285] Loading image: /var/lib/minikube/images/coredns_v1.13.1
	I1124 10:11:10.565440 1856079 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images import /var/lib/minikube/images/coredns_v1.13.1
	I1124 10:11:10.565497 1856079 ssh_runner.go:235] Completed: stat -c "%s %y" /var/lib/minikube/images/storage-provisioner_v5: (1.722880869s)
	I1124 10:11:10.565512 1856079 ssh_runner.go:352] existence check for /var/lib/minikube/images/storage-provisioner_v5: stat -c "%s %y" /var/lib/minikube/images/storage-provisioner_v5: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/images/storage-provisioner_v5': No such file or directory
	I1124 10:11:10.565525 1856079 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 --> /var/lib/minikube/images/storage-provisioner_v5 (8035840 bytes)
	I1124 10:11:11.939911 1856079 ssh_runner.go:235] Completed: sudo ctr -n=k8s.io images import /var/lib/minikube/images/coredns_v1.13.1: (1.374448272s)
	I1124 10:11:11.939996 1856079 cache_images.go:323] Transferred and loaded /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 from cache
	I1124 10:11:11.940040 1856079 containerd.go:285] Loading image: /var/lib/minikube/images/kube-controller-manager_v1.35.0-beta.0
	I1124 10:11:11.940126 1856079 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images import /var/lib/minikube/images/kube-controller-manager_v1.35.0-beta.0
	I1124 10:11:13.306075 1856079 ssh_runner.go:235] Completed: sudo ctr -n=k8s.io images import /var/lib/minikube/images/kube-controller-manager_v1.35.0-beta.0: (1.365893597s)
	I1124 10:11:13.306099 1856079 cache_images.go:323] Transferred and loaded /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 from cache
	I1124 10:11:13.306116 1856079 containerd.go:285] Loading image: /var/lib/minikube/images/kube-proxy_v1.35.0-beta.0
	I1124 10:11:13.306161 1856079 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images import /var/lib/minikube/images/kube-proxy_v1.35.0-beta.0
	I1124 10:11:14.589800 1856079 ssh_runner.go:235] Completed: sudo ctr -n=k8s.io images import /var/lib/minikube/images/kube-proxy_v1.35.0-beta.0: (1.283616626s)
	I1124 10:11:14.589824 1856079 cache_images.go:323] Transferred and loaded /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 from cache
	I1124 10:11:14.589843 1856079 containerd.go:285] Loading image: /var/lib/minikube/images/etcd_3.5.24-0
	I1124 10:11:14.589891 1856079 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images import /var/lib/minikube/images/etcd_3.5.24-0
	I1124 10:11:16.699561 1856079 ssh_runner.go:235] Completed: sudo ctr -n=k8s.io images import /var/lib/minikube/images/etcd_3.5.24-0: (2.10964602s)
	I1124 10:11:16.699585 1856079 cache_images.go:323] Transferred and loaded /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.5.24-0 from cache
	I1124 10:11:16.699603 1856079 containerd.go:285] Loading image: /var/lib/minikube/images/kube-apiserver_v1.35.0-beta.0
	I1124 10:11:16.699651 1856079 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images import /var/lib/minikube/images/kube-apiserver_v1.35.0-beta.0
	I1124 10:11:18.366138 1856079 ssh_runner.go:235] Completed: sudo ctr -n=k8s.io images import /var/lib/minikube/images/kube-apiserver_v1.35.0-beta.0: (1.666463592s)
	I1124 10:11:18.366163 1856079 cache_images.go:323] Transferred and loaded /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 from cache
	I1124 10:11:18.366181 1856079 containerd.go:285] Loading image: /var/lib/minikube/images/storage-provisioner_v5
	I1124 10:11:18.366231 1856079 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images import /var/lib/minikube/images/storage-provisioner_v5
	I1124 10:11:19.011488 1856079 cache_images.go:323] Transferred and loaded /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 from cache
	I1124 10:11:19.011522 1856079 cache_images.go:125] Successfully loaded all cached images
	I1124 10:11:19.011528 1856079 cache_images.go:94] duration metric: took 12.198955635s to LoadCachedImages
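	The image-load sequence above follows one pattern per image: `stat -c "%s %y"` on the staged tarball, and on a non-zero exit the tarball is scp'd from the host cache and imported with `ctr`. A minimal reproduction of that existence check, run against a scratch path rather than the real /var/lib/minikube/images directory (the path and variable names here are illustrative, not minikube's code):

```shell
# The log's existence check: a failing stat means the image tarball
# is absent on the node and must be transferred from the host cache.
needs_transfer=no
img="$(mktemp -d)/pause_3.10.1"          # scratch stand-in path; file does not exist
if ! stat -c "%s %y" "$img" >/dev/null 2>&1; then
  needs_transfer=yes                     # would trigger the scp + ctr import step
fi
echo "needs_transfer=$needs_transfer"
```

In the log this decision gates the `scp … --> /var/lib/minikube/images/…` and `sudo ctr -n=k8s.io images import …` steps that follow each failed stat.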
	I1124 10:11:19.011540 1856079 kubeadm.go:935] updating node { 192.168.76.2 8443 v1.35.0-beta.0 containerd true true} ...
	I1124 10:11:19.011642 1856079 kubeadm.go:947] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.35.0-beta.0/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=kubernetes-upgrade-188777 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.76.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.35.0-beta.0 ClusterName:kubernetes-upgrade-188777 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I1124 10:11:19.011708 1856079 ssh_runner.go:195] Run: sudo crictl info
	I1124 10:11:19.053187 1856079 cni.go:84] Creating CNI manager for ""
	I1124 10:11:19.053216 1856079 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1124 10:11:19.053233 1856079 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1124 10:11:19.053256 1856079 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.76.2 APIServerPort:8443 KubernetesVersion:v1.35.0-beta.0 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:kubernetes-upgrade-188777 NodeName:kubernetes-upgrade-188777 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.76.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.76.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1124 10:11:19.053369 1856079 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.76.2
	  bindPort: 8443
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "kubernetes-upgrade-188777"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.76.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.76.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8443
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.35.0-beta.0
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
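	The config printed above is a four-document YAML stream (InitConfiguration, ClusterConfiguration, KubeletConfiguration, KubeProxyConfiguration) that the log later stages as /var/tmp/minikube/kubeadm.yaml.new. A quick shell sanity check on such a stream, using an abridged copy of the documents above (the scratch file and check are illustrative, not part of minikube):

```shell
# Abridged copy of the generated kubeadm config; one `kind:` per YAML document.
cfg="$(mktemp)"
cat > "$cfg" <<'EOF'
apiVersion: kubeadm.k8s.io/v1beta4
kind: InitConfiguration
---
apiVersion: kubeadm.k8s.io/v1beta4
kind: ClusterConfiguration
kubernetesVersion: v1.35.0-beta.0
---
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
---
apiVersion: kubeproxy.config.k8s.io/v1alpha1
kind: KubeProxyConfiguration
EOF
# Confirm all four component documents are present before handing the
# file to `kubeadm init --config …`.
grep -c '^kind:' "$cfg"
```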
	I1124 10:11:19.053440 1856079 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.35.0-beta.0
	I1124 10:11:19.066958 1856079 binaries.go:54] Didn't find k8s binaries: sudo ls /var/lib/minikube/binaries/v1.35.0-beta.0: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/var/lib/minikube/binaries/v1.35.0-beta.0': No such file or directory
	
	Initiating transfer...
	I1124 10:11:19.067074 1856079 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/binaries/v1.35.0-beta.0
	I1124 10:11:19.082082 1856079 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubectl?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubectl.sha256
	I1124 10:11:19.082188 1856079 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl
	I1124 10:11:19.082264 1856079 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubelet?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubelet.sha256
	I1124 10:11:19.082291 1856079 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1124 10:11:19.082373 1856079 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 10:11:19.082421 1856079 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.35.0-beta.0/kubeadm
	I1124 10:11:19.112241 1856079 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.35.0-beta.0/kubeadm: stat -c "%s %y" /var/lib/minikube/binaries/v1.35.0-beta.0/kubeadm: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.35.0-beta.0/kubeadm': No such file or directory
	I1124 10:11:19.112281 1856079 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/linux/arm64/v1.35.0-beta.0/kubeadm --> /var/lib/minikube/binaries/v1.35.0-beta.0/kubeadm (68354232 bytes)
	I1124 10:11:19.112350 1856079 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl: stat -c "%s %y" /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.35.0-beta.0/kubectl': No such file or directory
	I1124 10:11:19.112362 1856079 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/linux/arm64/v1.35.0-beta.0/kubectl --> /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl (55181496 bytes)
	I1124 10:11:19.131697 1856079 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.35.0-beta.0/kubelet
	I1124 10:11:19.201229 1856079 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.35.0-beta.0/kubelet: stat -c "%s %y" /var/lib/minikube/binaries/v1.35.0-beta.0/kubelet: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.35.0-beta.0/kubelet': No such file or directory
	I1124 10:11:19.201468 1856079 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/linux/arm64/v1.35.0-beta.0/kubelet --> /var/lib/minikube/binaries/v1.35.0-beta.0/kubelet (54329636 bytes)
	I1124 10:11:20.219161 1856079 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1124 10:11:20.231442 1856079 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (336 bytes)
	I1124 10:11:20.251723 1856079 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (359 bytes)
	I1124 10:11:20.278805 1856079 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2245 bytes)
	I1124 10:11:20.299972 1856079 ssh_runner.go:195] Run: grep 192.168.76.2	control-plane.minikube.internal$ /etc/hosts
	I1124 10:11:20.303891 1856079 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.168.76.2	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I1124 10:11:20.325395 1856079 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1124 10:11:20.535788 1856079 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1124 10:11:20.567021 1856079 certs.go:69] Setting up /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/kubernetes-upgrade-188777 for IP: 192.168.76.2
	I1124 10:11:20.567044 1856079 certs.go:195] generating shared ca certs ...
	I1124 10:11:20.567060 1856079 certs.go:227] acquiring lock for ca certs: {Name:mkbe540a30c4376a351176f7fe6fec044d058b09 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1124 10:11:20.567211 1856079 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.key
	I1124 10:11:20.567261 1856079 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/proxy-client-ca.key
	I1124 10:11:20.567274 1856079 certs.go:257] generating profile certs ...
	I1124 10:11:20.567368 1856079 certs.go:360] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/kubernetes-upgrade-188777/client.key
	I1124 10:11:20.567442 1856079 certs.go:360] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/kubernetes-upgrade-188777/apiserver.key.730eda0e
	I1124 10:11:20.567494 1856079 certs.go:360] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/kubernetes-upgrade-188777/proxy-client.key
	I1124 10:11:20.567612 1856079 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/1654467.pem (1338 bytes)
	W1124 10:11:20.567654 1856079 certs.go:480] ignoring /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/1654467_empty.pem, impossibly tiny 0 bytes
	I1124 10:11:20.567667 1856079 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca-key.pem (1671 bytes)
	I1124 10:11:20.567694 1856079 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem (1078 bytes)
	I1124 10:11:20.567723 1856079 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/cert.pem (1123 bytes)
	I1124 10:11:20.567749 1856079 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/key.pem (1679 bytes)
	I1124 10:11:20.567799 1856079 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/ssl/certs/16544672.pem (1708 bytes)
	I1124 10:11:20.568462 1856079 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1124 10:11:20.608505 1856079 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I1124 10:11:20.666996 1856079 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1124 10:11:20.702010 1856079 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I1124 10:11:20.736483 1856079 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/kubernetes-upgrade-188777/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1436 bytes)
	I1124 10:11:20.784637 1856079 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/kubernetes-upgrade-188777/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I1124 10:11:20.822876 1856079 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/kubernetes-upgrade-188777/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1124 10:11:20.856142 1856079 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/kubernetes-upgrade-188777/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I1124 10:11:20.887758 1856079 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/ssl/certs/16544672.pem --> /usr/share/ca-certificates/16544672.pem (1708 bytes)
	I1124 10:11:20.923578 1856079 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1124 10:11:20.948375 1856079 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/1654467.pem --> /usr/share/ca-certificates/1654467.pem (1338 bytes)
	I1124 10:11:20.983449 1856079 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I1124 10:11:21.003195 1856079 ssh_runner.go:195] Run: openssl version
	I1124 10:11:21.023723 1856079 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/16544672.pem && ln -fs /usr/share/ca-certificates/16544672.pem /etc/ssl/certs/16544672.pem"
	I1124 10:11:21.036238 1856079 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/16544672.pem
	I1124 10:11:21.041812 1856079 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Nov 24 09:10 /usr/share/ca-certificates/16544672.pem
	I1124 10:11:21.041883 1856079 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/16544672.pem
	I1124 10:11:21.096807 1856079 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/16544672.pem /etc/ssl/certs/3ec20f2e.0"
	I1124 10:11:21.111488 1856079 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I1124 10:11:21.133346 1856079 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1124 10:11:21.137298 1856079 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Nov 24 08:44 /usr/share/ca-certificates/minikubeCA.pem
	I1124 10:11:21.137369 1856079 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1124 10:11:21.185404 1856079 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I1124 10:11:21.192875 1856079 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1654467.pem && ln -fs /usr/share/ca-certificates/1654467.pem /etc/ssl/certs/1654467.pem"
	I1124 10:11:21.200525 1856079 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1654467.pem
	I1124 10:11:21.207430 1856079 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Nov 24 09:10 /usr/share/ca-certificates/1654467.pem
	I1124 10:11:21.207494 1856079 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1654467.pem
	I1124 10:11:21.248871 1856079 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1654467.pem /etc/ssl/certs/51391683.0"
	I1124 10:11:21.256560 1856079 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1124 10:11:21.260658 1856079 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I1124 10:11:21.305780 1856079 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I1124 10:11:21.352107 1856079 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I1124 10:11:21.416864 1856079 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I1124 10:11:21.527063 1856079 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I1124 10:11:21.601218 1856079 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I1124 10:11:21.653826 1856079 kubeadm.go:401] StartCluster: {Name:kubernetes-upgrade-188777 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:3072 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:kubernetes-upgrade-188777 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.76.2 Port:8443 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1124 10:11:21.653926 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I1124 10:11:21.653988 1856079 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1124 10:11:21.719854 1856079 cri.go:89] found id: "6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:11:21.719877 1856079 cri.go:89] found id: "b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:11:21.719882 1856079 cri.go:89] found id: "c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:11:21.719897 1856079 cri.go:89] found id: "9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:11:21.719901 1856079 cri.go:89] found id: ""
	I1124 10:11:21.719955 1856079 ssh_runner.go:195] Run: sudo runc --root /run/containerd/runc/k8s.io list -f json
	W1124 10:11:21.744186 1856079 kubeadm.go:408] unpause failed: list paused: runc: sudo runc --root /run/containerd/runc/k8s.io list -f json: Process exited with status 1
	stdout:
	
	stderr:
	time="2025-11-24T10:11:21Z" level=error msg="open /run/containerd/runc/k8s.io: no such file or directory"
	I1124 10:11:21.744262 1856079 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1124 10:11:21.753938 1856079 kubeadm.go:417] found existing configuration files, will attempt cluster restart
	I1124 10:11:21.753957 1856079 kubeadm.go:598] restartPrimaryControlPlane start ...
	I1124 10:11:21.754020 1856079 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I1124 10:11:21.763762 1856079 kubeadm.go:131] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I1124 10:11:21.764164 1856079 kubeconfig.go:47] verify endpoint returned: get endpoint: "kubernetes-upgrade-188777" does not appear in /home/jenkins/minikube-integration/21978-1652607/kubeconfig
	I1124 10:11:21.764285 1856079 kubeconfig.go:62] /home/jenkins/minikube-integration/21978-1652607/kubeconfig needs updating (will repair): [kubeconfig missing "kubernetes-upgrade-188777" cluster setting kubeconfig missing "kubernetes-upgrade-188777" context setting]
	I1124 10:11:21.764572 1856079 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21978-1652607/kubeconfig: {Name:mk02121ae6148bede61eabf0ed4e1826024715f4 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1124 10:11:21.765126 1856079 kapi.go:59] client config for kubernetes-upgrade-188777: &rest.Config{Host:"https://192.168.76.2:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/kubernetes-upgrade-188777/client.crt", KeyFile:"/home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/kubernetes-upgrade-188777/client.key", CAFile:"/home/jenkins/minikube-integration/21978-1652607/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1fb2df0), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1124 10:11:21.765663 1856079 envvar.go:172] "Feature gate default state" feature="InformerResourceVersion" enabled=false
	I1124 10:11:21.765681 1856079 envvar.go:172] "Feature gate default state" feature="InOrderInformers" enabled=true
	I1124 10:11:21.765687 1856079 envvar.go:172] "Feature gate default state" feature="WatchListClient" enabled=false
	I1124 10:11:21.765691 1856079 envvar.go:172] "Feature gate default state" feature="ClientsAllowCBOR" enabled=false
	I1124 10:11:21.765700 1856079 envvar.go:172] "Feature gate default state" feature="ClientsPreferCBOR" enabled=false
	I1124 10:11:21.765973 1856079 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I1124 10:11:21.777902 1856079 kubeadm.go:645] detected kubeadm config drift (will reconfigure cluster from new /var/tmp/minikube/kubeadm.yaml):
	-- stdout --
	--- /var/tmp/minikube/kubeadm.yaml	2025-11-24 10:10:30.471758210 +0000
	+++ /var/tmp/minikube/kubeadm.yaml.new	2025-11-24 10:11:20.296283608 +0000
	@@ -1,4 +1,4 @@
	-apiVersion: kubeadm.k8s.io/v1beta3
	+apiVersion: kubeadm.k8s.io/v1beta4
	 kind: InitConfiguration
	 localAPIEndpoint:
	   advertiseAddress: 192.168.76.2
	@@ -14,31 +14,34 @@
	   criSocket: unix:///run/containerd/containerd.sock
	   name: "kubernetes-upgrade-188777"
	   kubeletExtraArgs:
	-    node-ip: 192.168.76.2
	+    - name: "node-ip"
	+      value: "192.168.76.2"
	   taints: []
	 ---
	-apiVersion: kubeadm.k8s.io/v1beta3
	+apiVersion: kubeadm.k8s.io/v1beta4
	 kind: ClusterConfiguration
	 apiServer:
	   certSANs: ["127.0.0.1", "localhost", "192.168.76.2"]
	   extraArgs:
	-    enable-admission-plugins: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	+    - name: "enable-admission-plugins"
	+      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	 controllerManager:
	   extraArgs:
	-    allocate-node-cidrs: "true"
	-    leader-elect: "false"
	+    - name: "allocate-node-cidrs"
	+      value: "true"
	+    - name: "leader-elect"
	+      value: "false"
	 scheduler:
	   extraArgs:
	-    leader-elect: "false"
	+    - name: "leader-elect"
	+      value: "false"
	 certificatesDir: /var/lib/minikube/certs
	 clusterName: mk
	 controlPlaneEndpoint: control-plane.minikube.internal:8443
	 etcd:
	   local:
	     dataDir: /var/lib/minikube/etcd
	-    extraArgs:
	-      proxy-refresh-interval: "70000"
	-kubernetesVersion: v1.28.0
	+kubernetesVersion: v1.35.0-beta.0
	 networking:
	   dnsDomain: cluster.local
	   podSubnet: "10.244.0.0/16"
	
	-- /stdout --
	I1124 10:11:21.777931 1856079 kubeadm.go:1161] stopping kube-system containers ...
	I1124 10:11:21.777945 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name: Namespaces:[kube-system]}
	I1124 10:11:21.778001 1856079 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1124 10:11:21.832558 1856079 cri.go:89] found id: "6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:11:21.832581 1856079 cri.go:89] found id: "b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:11:21.832586 1856079 cri.go:89] found id: "c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:11:21.832590 1856079 cri.go:89] found id: "9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:11:21.832593 1856079 cri.go:89] found id: ""
	I1124 10:11:21.832598 1856079 cri.go:252] Stopping containers: [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5 b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06 9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834]
	I1124 10:11:21.832656 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:11:21.836817 1856079 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl stop --timeout=10 6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5 b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06 9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834
	I1124 10:11:21.872016 1856079 ssh_runner.go:195] Run: sudo systemctl stop kubelet
	I1124 10:11:21.891073 1856079 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1124 10:11:21.899906 1856079 kubeadm.go:158] found existing configuration files:
	-rw------- 1 root root 5639 Nov 24 10:10 /etc/kubernetes/admin.conf
	-rw------- 1 root root 5656 Nov 24 10:10 /etc/kubernetes/controller-manager.conf
	-rw------- 1 root root 2039 Nov 24 10:10 /etc/kubernetes/kubelet.conf
	-rw------- 1 root root 5604 Nov 24 10:10 /etc/kubernetes/scheduler.conf
	
	I1124 10:11:21.899985 1856079 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf
	I1124 10:11:21.908485 1856079 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf
	I1124 10:11:21.917258 1856079 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf
	I1124 10:11:21.925472 1856079 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1124 10:11:21.925540 1856079 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1124 10:11:21.932843 1856079 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf
	I1124 10:11:21.940578 1856079 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1124 10:11:21.940649 1856079 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1124 10:11:21.947801 1856079 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1124 10:11:21.955946 1856079 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase certs all --config /var/tmp/minikube/kubeadm.yaml"
	I1124 10:11:22.018000 1856079 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase kubeconfig all --config /var/tmp/minikube/kubeadm.yaml"
	I1124 10:11:23.121341 1856079 ssh_runner.go:235] Completed: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase kubeconfig all --config /var/tmp/minikube/kubeadm.yaml": (1.103303143s)
	I1124 10:11:23.121411 1856079 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase kubelet-start --config /var/tmp/minikube/kubeadm.yaml"
	I1124 10:11:23.377436 1856079 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase control-plane all --config /var/tmp/minikube/kubeadm.yaml"
	I1124 10:11:23.497539 1856079 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase etcd local --config /var/tmp/minikube/kubeadm.yaml"
	I1124 10:11:23.567766 1856079 api_server.go:52] waiting for apiserver process to appear ...
	I1124 10:11:23.567917 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:11:24.068043 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:11:24.568784 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:11:25.068386 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:11:25.568009 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:11:26.068709 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:11:26.568892 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:11:27.068121 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:11:27.568202 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:11:28.069010 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:11:28.568327 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:11:29.068854 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:11:29.569018 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:11:30.068891 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:11:30.568107 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:11:31.068680 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:11:31.568457 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:11:32.068221 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:11:32.568268 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:11:33.068983 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:11:33.568047 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:11:34.067981 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:11:34.568603 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:11:35.068064 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:11:35.568623 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:11:36.068888 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:11:36.568024 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:11:37.068724 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:11:37.568865 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:11:38.068087 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:11:38.568926 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:11:39.068662 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:11:39.568519 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:11:40.068232 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:11:40.568694 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:11:41.068023 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:11:41.568011 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:11:42.068997 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:11:42.568156 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:11:43.068096 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:11:43.568935 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:11:44.068739 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:11:44.568039 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:11:45.068173 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:11:45.568819 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:11:46.068111 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:11:46.568836 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:11:47.067999 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:11:47.568048 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:11:48.068750 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:11:48.568061 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:11:49.068783 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:11:49.568031 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:11:50.068793 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:11:50.568070 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:11:51.068738 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:11:51.568847 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:11:52.068317 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:11:52.568643 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:11:53.068041 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:11:53.568510 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:11:54.067992 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:11:54.569082 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:11:55.068493 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:11:55.568063 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:11:56.068839 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:11:56.568900 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:11:57.068617 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:11:57.568045 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:11:58.068396 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:11:58.568662 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:11:59.068743 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:11:59.568752 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:12:00.112392 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:12:00.568643 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:12:01.068150 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:12:01.567992 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:12:02.068097 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:12:02.568875 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:12:03.069017 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:12:03.568057 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:12:04.068125 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:12:04.568585 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:12:05.068137 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:12:05.568963 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:12:06.068033 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:12:06.568942 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:12:07.068007 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:12:07.568890 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:12:08.068759 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:12:08.568077 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:12:09.067999 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:12:09.568082 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:12:10.068942 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:12:10.568060 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:12:11.068107 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:12:11.568077 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:12:12.068971 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:12:12.568589 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:12:13.068087 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:12:13.568954 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:12:14.068590 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:12:14.568004 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:12:15.068032 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:12:15.568611 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:12:16.068027 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:12:16.568863 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:12:17.068040 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:12:17.568028 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:12:18.068683 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:12:18.568806 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:12:19.068730 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:12:19.568107 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:12:20.068047 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:12:20.568436 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:12:21.068312 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:12:21.568719 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:12:22.068158 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:12:22.568737 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:12:23.068268 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:12:23.568212 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 10:12:23.568327 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 10:12:23.645389 1856079 cri.go:89] found id: "6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:12:23.645413 1856079 cri.go:89] found id: ""
	I1124 10:12:23.645422 1856079 logs.go:282] 1 containers: [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5]
	I1124 10:12:23.645485 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:12:23.658701 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 10:12:23.658779 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 10:12:23.728170 1856079 cri.go:89] found id: "c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:12:23.728189 1856079 cri.go:89] found id: ""
	I1124 10:12:23.728197 1856079 logs.go:282] 1 containers: [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06]
	I1124 10:12:23.728267 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:12:23.741463 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 10:12:23.741535 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 10:12:23.797049 1856079 cri.go:89] found id: ""
	I1124 10:12:23.797070 1856079 logs.go:282] 0 containers: []
	W1124 10:12:23.797079 1856079 logs.go:284] No container was found matching "coredns"
	I1124 10:12:23.797086 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 10:12:23.797146 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 10:12:23.857951 1856079 cri.go:89] found id: "9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:12:23.857972 1856079 cri.go:89] found id: ""
	I1124 10:12:23.857980 1856079 logs.go:282] 1 containers: [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834]
	I1124 10:12:23.858041 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:12:23.861970 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 10:12:23.862067 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 10:12:23.900820 1856079 cri.go:89] found id: ""
	I1124 10:12:23.900842 1856079 logs.go:282] 0 containers: []
	W1124 10:12:23.900851 1856079 logs.go:284] No container was found matching "kube-proxy"
	I1124 10:12:23.900858 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 10:12:23.900918 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 10:12:23.930899 1856079 cri.go:89] found id: "b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:12:23.930977 1856079 cri.go:89] found id: ""
	I1124 10:12:23.931000 1856079 logs.go:282] 1 containers: [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f]
	I1124 10:12:23.931087 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:12:23.935451 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 10:12:23.935530 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 10:12:23.964222 1856079 cri.go:89] found id: ""
	I1124 10:12:23.964244 1856079 logs.go:282] 0 containers: []
	W1124 10:12:23.964253 1856079 logs.go:284] No container was found matching "kindnet"
	I1124 10:12:23.964260 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1124 10:12:23.964321 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1124 10:12:23.999853 1856079 cri.go:89] found id: ""
	I1124 10:12:23.999875 1856079 logs.go:282] 0 containers: []
	W1124 10:12:23.999884 1856079 logs.go:284] No container was found matching "storage-provisioner"
	I1124 10:12:23.999897 1856079 logs.go:123] Gathering logs for kube-scheduler [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834] ...
	I1124 10:12:23.999909 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:12:24.047448 1856079 logs.go:123] Gathering logs for containerd ...
	I1124 10:12:24.047528 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 10:12:24.090399 1856079 logs.go:123] Gathering logs for kubelet ...
	I1124 10:12:24.090440 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 10:12:24.195621 1856079 logs.go:123] Gathering logs for describe nodes ...
	I1124 10:12:24.195701 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 10:12:24.292886 1856079 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 10:12:24.292911 1856079 logs.go:123] Gathering logs for kube-apiserver [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5] ...
	I1124 10:12:24.292925 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:12:24.340653 1856079 logs.go:123] Gathering logs for etcd [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06] ...
	I1124 10:12:24.340690 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:12:24.400587 1856079 logs.go:123] Gathering logs for kube-controller-manager [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f] ...
	I1124 10:12:24.400627 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:12:24.451667 1856079 logs.go:123] Gathering logs for container status ...
	I1124 10:12:24.451701 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 10:12:24.518978 1856079 logs.go:123] Gathering logs for dmesg ...
	I1124 10:12:24.519008 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 10:12:27.051292 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:12:27.061433 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 10:12:27.061505 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 10:12:27.090284 1856079 cri.go:89] found id: "6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:12:27.090303 1856079 cri.go:89] found id: ""
	I1124 10:12:27.090311 1856079 logs.go:282] 1 containers: [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5]
	I1124 10:12:27.090368 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:12:27.094611 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 10:12:27.094681 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 10:12:27.125289 1856079 cri.go:89] found id: "c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:12:27.125315 1856079 cri.go:89] found id: ""
	I1124 10:12:27.125325 1856079 logs.go:282] 1 containers: [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06]
	I1124 10:12:27.125444 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:12:27.129779 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 10:12:27.129858 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 10:12:27.160450 1856079 cri.go:89] found id: ""
	I1124 10:12:27.160478 1856079 logs.go:282] 0 containers: []
	W1124 10:12:27.160489 1856079 logs.go:284] No container was found matching "coredns"
	I1124 10:12:27.160496 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 10:12:27.160575 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 10:12:27.200430 1856079 cri.go:89] found id: "9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:12:27.200455 1856079 cri.go:89] found id: ""
	I1124 10:12:27.200465 1856079 logs.go:282] 1 containers: [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834]
	I1124 10:12:27.200528 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:12:27.205311 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 10:12:27.205388 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 10:12:27.233668 1856079 cri.go:89] found id: ""
	I1124 10:12:27.233694 1856079 logs.go:282] 0 containers: []
	W1124 10:12:27.233702 1856079 logs.go:284] No container was found matching "kube-proxy"
	I1124 10:12:27.233710 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 10:12:27.233769 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 10:12:27.263678 1856079 cri.go:89] found id: "b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:12:27.263703 1856079 cri.go:89] found id: ""
	I1124 10:12:27.263711 1856079 logs.go:282] 1 containers: [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f]
	I1124 10:12:27.263766 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:12:27.267740 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 10:12:27.267832 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 10:12:27.307712 1856079 cri.go:89] found id: ""
	I1124 10:12:27.307738 1856079 logs.go:282] 0 containers: []
	W1124 10:12:27.307747 1856079 logs.go:284] No container was found matching "kindnet"
	I1124 10:12:27.307753 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1124 10:12:27.307811 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1124 10:12:27.340956 1856079 cri.go:89] found id: ""
	I1124 10:12:27.340982 1856079 logs.go:282] 0 containers: []
	W1124 10:12:27.340991 1856079 logs.go:284] No container was found matching "storage-provisioner"
	I1124 10:12:27.341004 1856079 logs.go:123] Gathering logs for kube-apiserver [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5] ...
	I1124 10:12:27.341019 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:12:27.393884 1856079 logs.go:123] Gathering logs for etcd [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06] ...
	I1124 10:12:27.393927 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:12:27.434522 1856079 logs.go:123] Gathering logs for containerd ...
	I1124 10:12:27.434555 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 10:12:27.479512 1856079 logs.go:123] Gathering logs for dmesg ...
	I1124 10:12:27.479549 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 10:12:27.503419 1856079 logs.go:123] Gathering logs for describe nodes ...
	I1124 10:12:27.503452 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 10:12:27.608355 1856079 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 10:12:27.608378 1856079 logs.go:123] Gathering logs for kube-scheduler [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834] ...
	I1124 10:12:27.608391 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:12:27.682918 1856079 logs.go:123] Gathering logs for kube-controller-manager [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f] ...
	I1124 10:12:27.682963 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:12:27.776524 1856079 logs.go:123] Gathering logs for container status ...
	I1124 10:12:27.776557 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 10:12:27.830331 1856079 logs.go:123] Gathering logs for kubelet ...
	I1124 10:12:27.830360 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 10:12:30.415591 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:12:30.435469 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 10:12:30.435557 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 10:12:30.479817 1856079 cri.go:89] found id: "6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:12:30.479851 1856079 cri.go:89] found id: ""
	I1124 10:12:30.479859 1856079 logs.go:282] 1 containers: [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5]
	I1124 10:12:30.479916 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:12:30.487612 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 10:12:30.487696 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 10:12:30.522520 1856079 cri.go:89] found id: "c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:12:30.522545 1856079 cri.go:89] found id: ""
	I1124 10:12:30.522555 1856079 logs.go:282] 1 containers: [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06]
	I1124 10:12:30.522614 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:12:30.527233 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 10:12:30.527312 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 10:12:30.558779 1856079 cri.go:89] found id: ""
	I1124 10:12:30.558805 1856079 logs.go:282] 0 containers: []
	W1124 10:12:30.558813 1856079 logs.go:284] No container was found matching "coredns"
	I1124 10:12:30.558820 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 10:12:30.558881 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 10:12:30.616101 1856079 cri.go:89] found id: "9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:12:30.616127 1856079 cri.go:89] found id: ""
	I1124 10:12:30.616136 1856079 logs.go:282] 1 containers: [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834]
	I1124 10:12:30.616194 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:12:30.626409 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 10:12:30.626529 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 10:12:30.662390 1856079 cri.go:89] found id: ""
	I1124 10:12:30.662413 1856079 logs.go:282] 0 containers: []
	W1124 10:12:30.662422 1856079 logs.go:284] No container was found matching "kube-proxy"
	I1124 10:12:30.662428 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 10:12:30.662528 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 10:12:30.703089 1856079 cri.go:89] found id: "b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:12:30.703108 1856079 cri.go:89] found id: ""
	I1124 10:12:30.703116 1856079 logs.go:282] 1 containers: [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f]
	I1124 10:12:30.703202 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:12:30.707217 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 10:12:30.707291 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 10:12:30.736958 1856079 cri.go:89] found id: ""
	I1124 10:12:30.736980 1856079 logs.go:282] 0 containers: []
	W1124 10:12:30.736989 1856079 logs.go:284] No container was found matching "kindnet"
	I1124 10:12:30.736996 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1124 10:12:30.737061 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1124 10:12:30.772556 1856079 cri.go:89] found id: ""
	I1124 10:12:30.772579 1856079 logs.go:282] 0 containers: []
	W1124 10:12:30.772588 1856079 logs.go:284] No container was found matching "storage-provisioner"
	I1124 10:12:30.772605 1856079 logs.go:123] Gathering logs for kubelet ...
	I1124 10:12:30.772616 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 10:12:30.832933 1856079 logs.go:123] Gathering logs for dmesg ...
	I1124 10:12:30.833013 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 10:12:30.853236 1856079 logs.go:123] Gathering logs for describe nodes ...
	I1124 10:12:30.853329 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 10:12:30.948815 1856079 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 10:12:30.948834 1856079 logs.go:123] Gathering logs for etcd [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06] ...
	I1124 10:12:30.948847 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:12:30.995360 1856079 logs.go:123] Gathering logs for kube-controller-manager [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f] ...
	I1124 10:12:30.995436 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:12:31.033941 1856079 logs.go:123] Gathering logs for container status ...
	I1124 10:12:31.033976 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 10:12:31.090378 1856079 logs.go:123] Gathering logs for kube-apiserver [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5] ...
	I1124 10:12:31.090416 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:12:31.166472 1856079 logs.go:123] Gathering logs for kube-scheduler [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834] ...
	I1124 10:12:31.166516 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:12:31.203368 1856079 logs.go:123] Gathering logs for containerd ...
	I1124 10:12:31.203449 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 10:12:33.746607 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:12:33.758664 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 10:12:33.758736 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 10:12:33.794246 1856079 cri.go:89] found id: "6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:12:33.794264 1856079 cri.go:89] found id: ""
	I1124 10:12:33.794272 1856079 logs.go:282] 1 containers: [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5]
	I1124 10:12:33.794317 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:12:33.798487 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 10:12:33.798561 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 10:12:33.836479 1856079 cri.go:89] found id: "c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:12:33.836498 1856079 cri.go:89] found id: ""
	I1124 10:12:33.836506 1856079 logs.go:282] 1 containers: [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06]
	I1124 10:12:33.836569 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:12:33.840534 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 10:12:33.840616 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 10:12:33.876922 1856079 cri.go:89] found id: ""
	I1124 10:12:33.876943 1856079 logs.go:282] 0 containers: []
	W1124 10:12:33.876952 1856079 logs.go:284] No container was found matching "coredns"
	I1124 10:12:33.876958 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 10:12:33.877024 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 10:12:33.913874 1856079 cri.go:89] found id: "9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:12:33.913945 1856079 cri.go:89] found id: ""
	I1124 10:12:33.913967 1856079 logs.go:282] 1 containers: [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834]
	I1124 10:12:33.914053 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:12:33.918362 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 10:12:33.918431 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 10:12:33.957809 1856079 cri.go:89] found id: ""
	I1124 10:12:33.957831 1856079 logs.go:282] 0 containers: []
	W1124 10:12:33.957839 1856079 logs.go:284] No container was found matching "kube-proxy"
	I1124 10:12:33.957846 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 10:12:33.957903 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 10:12:33.997085 1856079 cri.go:89] found id: "b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:12:33.997105 1856079 cri.go:89] found id: ""
	I1124 10:12:33.997113 1856079 logs.go:282] 1 containers: [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f]
	I1124 10:12:33.997172 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:12:34.002649 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 10:12:34.002811 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 10:12:34.048721 1856079 cri.go:89] found id: ""
	I1124 10:12:34.048751 1856079 logs.go:282] 0 containers: []
	W1124 10:12:34.048768 1856079 logs.go:284] No container was found matching "kindnet"
	I1124 10:12:34.048776 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1124 10:12:34.048840 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1124 10:12:34.102205 1856079 cri.go:89] found id: ""
	I1124 10:12:34.102234 1856079 logs.go:282] 0 containers: []
	W1124 10:12:34.102244 1856079 logs.go:284] No container was found matching "storage-provisioner"
	I1124 10:12:34.102261 1856079 logs.go:123] Gathering logs for etcd [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06] ...
	I1124 10:12:34.102272 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:12:34.143872 1856079 logs.go:123] Gathering logs for kube-scheduler [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834] ...
	I1124 10:12:34.143913 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:12:34.189782 1856079 logs.go:123] Gathering logs for kubelet ...
	I1124 10:12:34.189819 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 10:12:34.254300 1856079 logs.go:123] Gathering logs for dmesg ...
	I1124 10:12:34.254337 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 10:12:34.272165 1856079 logs.go:123] Gathering logs for describe nodes ...
	I1124 10:12:34.272193 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 10:12:34.362090 1856079 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 10:12:34.362113 1856079 logs.go:123] Gathering logs for kube-apiserver [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5] ...
	I1124 10:12:34.362126 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:12:34.450999 1856079 logs.go:123] Gathering logs for kube-controller-manager [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f] ...
	I1124 10:12:34.451037 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:12:34.503093 1856079 logs.go:123] Gathering logs for containerd ...
	I1124 10:12:34.503127 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 10:12:34.544934 1856079 logs.go:123] Gathering logs for container status ...
	I1124 10:12:34.544970 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 10:12:37.079086 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:12:37.089666 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 10:12:37.089738 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 10:12:37.120294 1856079 cri.go:89] found id: "6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:12:37.120319 1856079 cri.go:89] found id: ""
	I1124 10:12:37.120328 1856079 logs.go:282] 1 containers: [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5]
	I1124 10:12:37.120393 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:12:37.124173 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 10:12:37.124249 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 10:12:37.150277 1856079 cri.go:89] found id: "c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:12:37.150307 1856079 cri.go:89] found id: ""
	I1124 10:12:37.150317 1856079 logs.go:282] 1 containers: [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06]
	I1124 10:12:37.150376 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:12:37.154267 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 10:12:37.154341 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 10:12:37.179954 1856079 cri.go:89] found id: ""
	I1124 10:12:37.179977 1856079 logs.go:282] 0 containers: []
	W1124 10:12:37.179987 1856079 logs.go:284] No container was found matching "coredns"
	I1124 10:12:37.179994 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 10:12:37.180062 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 10:12:37.208059 1856079 cri.go:89] found id: "9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:12:37.208139 1856079 cri.go:89] found id: ""
	I1124 10:12:37.208162 1856079 logs.go:282] 1 containers: [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834]
	I1124 10:12:37.208225 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:12:37.212225 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 10:12:37.212301 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 10:12:37.239278 1856079 cri.go:89] found id: ""
	I1124 10:12:37.239300 1856079 logs.go:282] 0 containers: []
	W1124 10:12:37.239309 1856079 logs.go:284] No container was found matching "kube-proxy"
	I1124 10:12:37.239316 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 10:12:37.239379 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 10:12:37.264811 1856079 cri.go:89] found id: "b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:12:37.264885 1856079 cri.go:89] found id: ""
	I1124 10:12:37.264909 1856079 logs.go:282] 1 containers: [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f]
	I1124 10:12:37.264985 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:12:37.268823 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 10:12:37.268902 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 10:12:37.298835 1856079 cri.go:89] found id: ""
	I1124 10:12:37.298864 1856079 logs.go:282] 0 containers: []
	W1124 10:12:37.298874 1856079 logs.go:284] No container was found matching "kindnet"
	I1124 10:12:37.298882 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1124 10:12:37.298957 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1124 10:12:37.324057 1856079 cri.go:89] found id: ""
	I1124 10:12:37.324081 1856079 logs.go:282] 0 containers: []
	W1124 10:12:37.324090 1856079 logs.go:284] No container was found matching "storage-provisioner"
	I1124 10:12:37.324105 1856079 logs.go:123] Gathering logs for kube-apiserver [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5] ...
	I1124 10:12:37.324157 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:12:37.371606 1856079 logs.go:123] Gathering logs for etcd [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06] ...
	I1124 10:12:37.371642 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:12:37.428845 1856079 logs.go:123] Gathering logs for kube-scheduler [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834] ...
	I1124 10:12:37.428882 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:12:37.497719 1856079 logs.go:123] Gathering logs for kube-controller-manager [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f] ...
	I1124 10:12:37.497759 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:12:37.549770 1856079 logs.go:123] Gathering logs for containerd ...
	I1124 10:12:37.549802 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 10:12:37.604815 1856079 logs.go:123] Gathering logs for container status ...
	I1124 10:12:37.604852 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 10:12:37.692086 1856079 logs.go:123] Gathering logs for kubelet ...
	I1124 10:12:37.692114 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 10:12:37.797879 1856079 logs.go:123] Gathering logs for dmesg ...
	I1124 10:12:37.797920 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 10:12:37.817641 1856079 logs.go:123] Gathering logs for describe nodes ...
	I1124 10:12:37.817675 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 10:12:37.940438 1856079 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 10:12:40.440719 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:12:40.452164 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 10:12:40.452243 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 10:12:40.484790 1856079 cri.go:89] found id: "6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:12:40.484817 1856079 cri.go:89] found id: ""
	I1124 10:12:40.484825 1856079 logs.go:282] 1 containers: [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5]
	I1124 10:12:40.484879 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:12:40.489308 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 10:12:40.489398 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 10:12:40.527029 1856079 cri.go:89] found id: "c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:12:40.527054 1856079 cri.go:89] found id: ""
	I1124 10:12:40.527076 1856079 logs.go:282] 1 containers: [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06]
	I1124 10:12:40.527129 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:12:40.531589 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 10:12:40.531711 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 10:12:40.563943 1856079 cri.go:89] found id: ""
	I1124 10:12:40.563972 1856079 logs.go:282] 0 containers: []
	W1124 10:12:40.563981 1856079 logs.go:284] No container was found matching "coredns"
	I1124 10:12:40.563987 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 10:12:40.564046 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 10:12:40.595495 1856079 cri.go:89] found id: "9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:12:40.595528 1856079 cri.go:89] found id: ""
	I1124 10:12:40.595598 1856079 logs.go:282] 1 containers: [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834]
	I1124 10:12:40.595732 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:12:40.602179 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 10:12:40.602255 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 10:12:40.653305 1856079 cri.go:89] found id: ""
	I1124 10:12:40.653338 1856079 logs.go:282] 0 containers: []
	W1124 10:12:40.653350 1856079 logs.go:284] No container was found matching "kube-proxy"
	I1124 10:12:40.653359 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 10:12:40.653433 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 10:12:40.700667 1856079 cri.go:89] found id: "b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:12:40.700695 1856079 cri.go:89] found id: ""
	I1124 10:12:40.700710 1856079 logs.go:282] 1 containers: [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f]
	I1124 10:12:40.700799 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:12:40.706942 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 10:12:40.707055 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 10:12:40.761149 1856079 cri.go:89] found id: ""
	I1124 10:12:40.761183 1856079 logs.go:282] 0 containers: []
	W1124 10:12:40.761192 1856079 logs.go:284] No container was found matching "kindnet"
	I1124 10:12:40.761202 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1124 10:12:40.761283 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1124 10:12:40.839258 1856079 cri.go:89] found id: ""
	I1124 10:12:40.839286 1856079 logs.go:282] 0 containers: []
	W1124 10:12:40.839295 1856079 logs.go:284] No container was found matching "storage-provisioner"
	I1124 10:12:40.839345 1856079 logs.go:123] Gathering logs for dmesg ...
	I1124 10:12:40.839365 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 10:12:40.858506 1856079 logs.go:123] Gathering logs for describe nodes ...
	I1124 10:12:40.858541 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 10:12:40.957696 1856079 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 10:12:40.957721 1856079 logs.go:123] Gathering logs for etcd [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06] ...
	I1124 10:12:40.957737 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:12:41.007908 1856079 logs.go:123] Gathering logs for kube-controller-manager [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f] ...
	I1124 10:12:41.007949 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:12:41.079192 1856079 logs.go:123] Gathering logs for kubelet ...
	I1124 10:12:41.079228 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 10:12:41.146107 1856079 logs.go:123] Gathering logs for kube-apiserver [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5] ...
	I1124 10:12:41.146146 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:12:41.195299 1856079 logs.go:123] Gathering logs for kube-scheduler [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834] ...
	I1124 10:12:41.195342 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:12:41.235465 1856079 logs.go:123] Gathering logs for containerd ...
	I1124 10:12:41.235498 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 10:12:41.271883 1856079 logs.go:123] Gathering logs for container status ...
	I1124 10:12:41.271927 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 10:12:43.807028 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:12:43.817135 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 10:12:43.817211 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 10:12:43.848693 1856079 cri.go:89] found id: "6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:12:43.848718 1856079 cri.go:89] found id: ""
	I1124 10:12:43.848726 1856079 logs.go:282] 1 containers: [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5]
	I1124 10:12:43.848785 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:12:43.852694 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 10:12:43.852780 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 10:12:43.889824 1856079 cri.go:89] found id: "c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:12:43.889848 1856079 cri.go:89] found id: ""
	I1124 10:12:43.889856 1856079 logs.go:282] 1 containers: [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06]
	I1124 10:12:43.889914 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:12:43.894503 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 10:12:43.894590 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 10:12:43.937061 1856079 cri.go:89] found id: ""
	I1124 10:12:43.937102 1856079 logs.go:282] 0 containers: []
	W1124 10:12:43.937124 1856079 logs.go:284] No container was found matching "coredns"
	I1124 10:12:43.937136 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 10:12:43.937204 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 10:12:43.966664 1856079 cri.go:89] found id: "9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:12:43.966727 1856079 cri.go:89] found id: ""
	I1124 10:12:43.966756 1856079 logs.go:282] 1 containers: [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834]
	I1124 10:12:43.966834 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:12:43.970396 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 10:12:43.970496 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 10:12:43.996284 1856079 cri.go:89] found id: ""
	I1124 10:12:43.996352 1856079 logs.go:282] 0 containers: []
	W1124 10:12:43.996377 1856079 logs.go:284] No container was found matching "kube-proxy"
	I1124 10:12:43.996400 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 10:12:43.996496 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 10:12:44.024743 1856079 cri.go:89] found id: "b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:12:44.024778 1856079 cri.go:89] found id: ""
	I1124 10:12:44.024788 1856079 logs.go:282] 1 containers: [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f]
	I1124 10:12:44.024866 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:12:44.028927 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 10:12:44.029053 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 10:12:44.055978 1856079 cri.go:89] found id: ""
	I1124 10:12:44.056005 1856079 logs.go:282] 0 containers: []
	W1124 10:12:44.056014 1856079 logs.go:284] No container was found matching "kindnet"
	I1124 10:12:44.056024 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1124 10:12:44.056086 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1124 10:12:44.084369 1856079 cri.go:89] found id: ""
	I1124 10:12:44.084440 1856079 logs.go:282] 0 containers: []
	W1124 10:12:44.084455 1856079 logs.go:284] No container was found matching "storage-provisioner"
	I1124 10:12:44.084470 1856079 logs.go:123] Gathering logs for kubelet ...
	I1124 10:12:44.084482 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 10:12:44.145182 1856079 logs.go:123] Gathering logs for describe nodes ...
	I1124 10:12:44.145218 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 10:12:44.209913 1856079 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 10:12:44.209935 1856079 logs.go:123] Gathering logs for container status ...
	I1124 10:12:44.209950 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 10:12:44.238593 1856079 logs.go:123] Gathering logs for dmesg ...
	I1124 10:12:44.238623 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 10:12:44.254881 1856079 logs.go:123] Gathering logs for kube-apiserver [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5] ...
	I1124 10:12:44.254912 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:12:44.289441 1856079 logs.go:123] Gathering logs for etcd [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06] ...
	I1124 10:12:44.289477 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:12:44.324475 1856079 logs.go:123] Gathering logs for kube-scheduler [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834] ...
	I1124 10:12:44.324508 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:12:44.368575 1856079 logs.go:123] Gathering logs for kube-controller-manager [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f] ...
	I1124 10:12:44.368607 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:12:44.403036 1856079 logs.go:123] Gathering logs for containerd ...
	I1124 10:12:44.403070 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 10:12:46.938632 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:12:46.959035 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 10:12:46.959113 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 10:12:46.992901 1856079 cri.go:89] found id: "6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:12:46.992926 1856079 cri.go:89] found id: ""
	I1124 10:12:46.992934 1856079 logs.go:282] 1 containers: [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5]
	I1124 10:12:46.992991 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:12:46.997047 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 10:12:46.997122 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 10:12:47.026640 1856079 cri.go:89] found id: "c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:12:47.026665 1856079 cri.go:89] found id: ""
	I1124 10:12:47.026673 1856079 logs.go:282] 1 containers: [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06]
	I1124 10:12:47.026730 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:12:47.031324 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 10:12:47.031421 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 10:12:47.061762 1856079 cri.go:89] found id: ""
	I1124 10:12:47.061789 1856079 logs.go:282] 0 containers: []
	W1124 10:12:47.061798 1856079 logs.go:284] No container was found matching "coredns"
	I1124 10:12:47.061806 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 10:12:47.061869 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 10:12:47.092837 1856079 cri.go:89] found id: "9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:12:47.092861 1856079 cri.go:89] found id: ""
	I1124 10:12:47.092869 1856079 logs.go:282] 1 containers: [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834]
	I1124 10:12:47.092924 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:12:47.096968 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 10:12:47.097049 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 10:12:47.126843 1856079 cri.go:89] found id: ""
	I1124 10:12:47.126881 1856079 logs.go:282] 0 containers: []
	W1124 10:12:47.126891 1856079 logs.go:284] No container was found matching "kube-proxy"
	I1124 10:12:47.126899 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 10:12:47.126999 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 10:12:47.169073 1856079 cri.go:89] found id: "b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:12:47.169100 1856079 cri.go:89] found id: ""
	I1124 10:12:47.169110 1856079 logs.go:282] 1 containers: [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f]
	I1124 10:12:47.169169 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:12:47.174303 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 10:12:47.174423 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 10:12:47.208822 1856079 cri.go:89] found id: ""
	I1124 10:12:47.208847 1856079 logs.go:282] 0 containers: []
	W1124 10:12:47.208856 1856079 logs.go:284] No container was found matching "kindnet"
	I1124 10:12:47.208862 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1124 10:12:47.208924 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1124 10:12:47.238605 1856079 cri.go:89] found id: ""
	I1124 10:12:47.238630 1856079 logs.go:282] 0 containers: []
	W1124 10:12:47.238640 1856079 logs.go:284] No container was found matching "storage-provisioner"
	I1124 10:12:47.238678 1856079 logs.go:123] Gathering logs for kube-controller-manager [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f] ...
	I1124 10:12:47.238699 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:12:47.285930 1856079 logs.go:123] Gathering logs for containerd ...
	I1124 10:12:47.285961 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 10:12:47.321528 1856079 logs.go:123] Gathering logs for dmesg ...
	I1124 10:12:47.321564 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 10:12:47.339348 1856079 logs.go:123] Gathering logs for kube-scheduler [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834] ...
	I1124 10:12:47.339377 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:12:47.388014 1856079 logs.go:123] Gathering logs for container status ...
	I1124 10:12:47.388048 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 10:12:47.449594 1856079 logs.go:123] Gathering logs for kubelet ...
	I1124 10:12:47.449623 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 10:12:47.513932 1856079 logs.go:123] Gathering logs for describe nodes ...
	I1124 10:12:47.514019 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 10:12:47.600651 1856079 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 10:12:47.600669 1856079 logs.go:123] Gathering logs for kube-apiserver [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5] ...
	I1124 10:12:47.600682 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:12:47.666326 1856079 logs.go:123] Gathering logs for etcd [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06] ...
	I1124 10:12:47.666368 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:12:50.227638 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:12:50.251834 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 10:12:50.251912 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 10:12:50.295316 1856079 cri.go:89] found id: "6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:12:50.295343 1856079 cri.go:89] found id: ""
	I1124 10:12:50.295353 1856079 logs.go:282] 1 containers: [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5]
	I1124 10:12:50.295466 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:12:50.305290 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 10:12:50.305368 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 10:12:50.366714 1856079 cri.go:89] found id: "c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:12:50.366739 1856079 cri.go:89] found id: ""
	I1124 10:12:50.366748 1856079 logs.go:282] 1 containers: [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06]
	I1124 10:12:50.366807 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:12:50.375044 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 10:12:50.375130 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 10:12:50.419225 1856079 cri.go:89] found id: ""
	I1124 10:12:50.419253 1856079 logs.go:282] 0 containers: []
	W1124 10:12:50.419263 1856079 logs.go:284] No container was found matching "coredns"
	I1124 10:12:50.419270 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 10:12:50.419334 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 10:12:50.466862 1856079 cri.go:89] found id: "9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:12:50.466888 1856079 cri.go:89] found id: ""
	I1124 10:12:50.466898 1856079 logs.go:282] 1 containers: [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834]
	I1124 10:12:50.466992 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:12:50.475006 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 10:12:50.475090 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 10:12:50.518732 1856079 cri.go:89] found id: ""
	I1124 10:12:50.518760 1856079 logs.go:282] 0 containers: []
	W1124 10:12:50.518770 1856079 logs.go:284] No container was found matching "kube-proxy"
	I1124 10:12:50.518777 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 10:12:50.518837 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 10:12:50.562706 1856079 cri.go:89] found id: "b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:12:50.562732 1856079 cri.go:89] found id: ""
	I1124 10:12:50.562741 1856079 logs.go:282] 1 containers: [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f]
	I1124 10:12:50.562806 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:12:50.569497 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 10:12:50.569575 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 10:12:50.618327 1856079 cri.go:89] found id: ""
	I1124 10:12:50.618365 1856079 logs.go:282] 0 containers: []
	W1124 10:12:50.618375 1856079 logs.go:284] No container was found matching "kindnet"
	I1124 10:12:50.618401 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1124 10:12:50.618499 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1124 10:12:50.672654 1856079 cri.go:89] found id: ""
	I1124 10:12:50.672678 1856079 logs.go:282] 0 containers: []
	W1124 10:12:50.672688 1856079 logs.go:284] No container was found matching "storage-provisioner"
	I1124 10:12:50.672703 1856079 logs.go:123] Gathering logs for container status ...
	I1124 10:12:50.672715 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 10:12:50.773390 1856079 logs.go:123] Gathering logs for kubelet ...
	I1124 10:12:50.773424 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 10:12:50.862988 1856079 logs.go:123] Gathering logs for kube-apiserver [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5] ...
	I1124 10:12:50.863027 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:12:50.951067 1856079 logs.go:123] Gathering logs for etcd [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06] ...
	I1124 10:12:50.951118 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:12:51.017770 1856079 logs.go:123] Gathering logs for kube-scheduler [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834] ...
	I1124 10:12:51.017819 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:12:51.081279 1856079 logs.go:123] Gathering logs for kube-controller-manager [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f] ...
	I1124 10:12:51.081317 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:12:51.116586 1856079 logs.go:123] Gathering logs for containerd ...
	I1124 10:12:51.116681 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 10:12:51.167883 1856079 logs.go:123] Gathering logs for dmesg ...
	I1124 10:12:51.167990 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 10:12:51.195516 1856079 logs.go:123] Gathering logs for describe nodes ...
	I1124 10:12:51.195544 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 10:12:51.306209 1856079 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 10:12:53.806939 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:12:53.820017 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 10:12:53.820091 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 10:12:53.861728 1856079 cri.go:89] found id: "6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:12:53.861750 1856079 cri.go:89] found id: ""
	I1124 10:12:53.861758 1856079 logs.go:282] 1 containers: [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5]
	I1124 10:12:53.861816 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:12:53.866867 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 10:12:53.866941 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 10:12:53.932837 1856079 cri.go:89] found id: "c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:12:53.932862 1856079 cri.go:89] found id: ""
	I1124 10:12:53.932869 1856079 logs.go:282] 1 containers: [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06]
	I1124 10:12:53.932926 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:12:53.955235 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 10:12:53.955334 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 10:12:54.013499 1856079 cri.go:89] found id: ""
	I1124 10:12:54.013528 1856079 logs.go:282] 0 containers: []
	W1124 10:12:54.013537 1856079 logs.go:284] No container was found matching "coredns"
	I1124 10:12:54.013545 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 10:12:54.013660 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 10:12:54.064798 1856079 cri.go:89] found id: "9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:12:54.064822 1856079 cri.go:89] found id: ""
	I1124 10:12:54.064830 1856079 logs.go:282] 1 containers: [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834]
	I1124 10:12:54.064918 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:12:54.069152 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 10:12:54.069259 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 10:12:54.125276 1856079 cri.go:89] found id: ""
	I1124 10:12:54.125305 1856079 logs.go:282] 0 containers: []
	W1124 10:12:54.125314 1856079 logs.go:284] No container was found matching "kube-proxy"
	I1124 10:12:54.125320 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 10:12:54.125403 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 10:12:54.172821 1856079 cri.go:89] found id: "b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:12:54.172894 1856079 cri.go:89] found id: ""
	I1124 10:12:54.172916 1856079 logs.go:282] 1 containers: [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f]
	I1124 10:12:54.172998 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:12:54.178601 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 10:12:54.178721 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 10:12:54.228694 1856079 cri.go:89] found id: ""
	I1124 10:12:54.228764 1856079 logs.go:282] 0 containers: []
	W1124 10:12:54.228797 1856079 logs.go:284] No container was found matching "kindnet"
	I1124 10:12:54.228819 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1124 10:12:54.228913 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1124 10:12:54.276886 1856079 cri.go:89] found id: ""
	I1124 10:12:54.276916 1856079 logs.go:282] 0 containers: []
	W1124 10:12:54.276926 1856079 logs.go:284] No container was found matching "storage-provisioner"
	I1124 10:12:54.276940 1856079 logs.go:123] Gathering logs for container status ...
	I1124 10:12:54.276953 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 10:12:54.340316 1856079 logs.go:123] Gathering logs for kubelet ...
	I1124 10:12:54.340345 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 10:12:54.432438 1856079 logs.go:123] Gathering logs for dmesg ...
	I1124 10:12:54.432477 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 10:12:54.452527 1856079 logs.go:123] Gathering logs for kube-apiserver [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5] ...
	I1124 10:12:54.452559 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:12:54.515299 1856079 logs.go:123] Gathering logs for etcd [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06] ...
	I1124 10:12:54.515332 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:12:54.571278 1856079 logs.go:123] Gathering logs for kube-controller-manager [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f] ...
	I1124 10:12:54.571313 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:12:54.615444 1856079 logs.go:123] Gathering logs for containerd ...
	I1124 10:12:54.615480 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 10:12:54.654856 1856079 logs.go:123] Gathering logs for describe nodes ...
	I1124 10:12:54.654893 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 10:12:54.763254 1856079 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 10:12:54.763288 1856079 logs.go:123] Gathering logs for kube-scheduler [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834] ...
	I1124 10:12:54.763301 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:12:57.353980 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:12:57.364120 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 10:12:57.364216 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 10:12:57.389782 1856079 cri.go:89] found id: "6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:12:57.389803 1856079 cri.go:89] found id: ""
	I1124 10:12:57.389811 1856079 logs.go:282] 1 containers: [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5]
	I1124 10:12:57.389868 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:12:57.393561 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 10:12:57.393636 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 10:12:57.418943 1856079 cri.go:89] found id: "c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:12:57.418966 1856079 cri.go:89] found id: ""
	I1124 10:12:57.418975 1856079 logs.go:282] 1 containers: [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06]
	I1124 10:12:57.419030 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:12:57.422593 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 10:12:57.422670 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 10:12:57.449298 1856079 cri.go:89] found id: ""
	I1124 10:12:57.449322 1856079 logs.go:282] 0 containers: []
	W1124 10:12:57.449330 1856079 logs.go:284] No container was found matching "coredns"
	I1124 10:12:57.449337 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 10:12:57.449396 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 10:12:57.488851 1856079 cri.go:89] found id: "9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:12:57.488870 1856079 cri.go:89] found id: ""
	I1124 10:12:57.488899 1856079 logs.go:282] 1 containers: [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834]
	I1124 10:12:57.488957 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:12:57.493609 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 10:12:57.493690 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 10:12:57.528386 1856079 cri.go:89] found id: ""
	I1124 10:12:57.528441 1856079 logs.go:282] 0 containers: []
	W1124 10:12:57.528454 1856079 logs.go:284] No container was found matching "kube-proxy"
	I1124 10:12:57.528464 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 10:12:57.528584 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 10:12:57.562947 1856079 cri.go:89] found id: "b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:12:57.562966 1856079 cri.go:89] found id: ""
	I1124 10:12:57.562973 1856079 logs.go:282] 1 containers: [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f]
	I1124 10:12:57.563026 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:12:57.568322 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 10:12:57.568393 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 10:12:57.605365 1856079 cri.go:89] found id: ""
	I1124 10:12:57.605386 1856079 logs.go:282] 0 containers: []
	W1124 10:12:57.605395 1856079 logs.go:284] No container was found matching "kindnet"
	I1124 10:12:57.605401 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1124 10:12:57.605461 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1124 10:12:57.653829 1856079 cri.go:89] found id: ""
	I1124 10:12:57.653859 1856079 logs.go:282] 0 containers: []
	W1124 10:12:57.653868 1856079 logs.go:284] No container was found matching "storage-provisioner"
	I1124 10:12:57.653883 1856079 logs.go:123] Gathering logs for container status ...
	I1124 10:12:57.653894 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 10:12:57.695324 1856079 logs.go:123] Gathering logs for dmesg ...
	I1124 10:12:57.695414 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 10:12:57.715264 1856079 logs.go:123] Gathering logs for describe nodes ...
	I1124 10:12:57.715343 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 10:12:57.831315 1856079 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 10:12:57.831386 1856079 logs.go:123] Gathering logs for etcd [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06] ...
	I1124 10:12:57.831414 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:12:57.888062 1856079 logs.go:123] Gathering logs for containerd ...
	I1124 10:12:57.888095 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 10:12:57.955517 1856079 logs.go:123] Gathering logs for kubelet ...
	I1124 10:12:57.955601 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 10:12:58.032307 1856079 logs.go:123] Gathering logs for kube-apiserver [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5] ...
	I1124 10:12:58.032352 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:12:58.107455 1856079 logs.go:123] Gathering logs for kube-scheduler [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834] ...
	I1124 10:12:58.107491 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:12:58.160035 1856079 logs.go:123] Gathering logs for kube-controller-manager [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f] ...
	I1124 10:12:58.160069 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:13:00.703475 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:13:00.716351 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 10:13:00.716438 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 10:13:00.744517 1856079 cri.go:89] found id: "6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:13:00.744541 1856079 cri.go:89] found id: ""
	I1124 10:13:00.744551 1856079 logs.go:282] 1 containers: [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5]
	I1124 10:13:00.744608 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:00.748487 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 10:13:00.748561 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 10:13:00.775659 1856079 cri.go:89] found id: "c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:13:00.775684 1856079 cri.go:89] found id: ""
	I1124 10:13:00.775693 1856079 logs.go:282] 1 containers: [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06]
	I1124 10:13:00.775749 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:00.779659 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 10:13:00.779739 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 10:13:00.817903 1856079 cri.go:89] found id: ""
	I1124 10:13:00.817934 1856079 logs.go:282] 0 containers: []
	W1124 10:13:00.817943 1856079 logs.go:284] No container was found matching "coredns"
	I1124 10:13:00.817950 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 10:13:00.818016 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 10:13:00.852328 1856079 cri.go:89] found id: "9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:13:00.852348 1856079 cri.go:89] found id: ""
	I1124 10:13:00.852357 1856079 logs.go:282] 1 containers: [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834]
	I1124 10:13:00.852417 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:00.856127 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 10:13:00.856205 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 10:13:00.884396 1856079 cri.go:89] found id: ""
	I1124 10:13:00.884424 1856079 logs.go:282] 0 containers: []
	W1124 10:13:00.884433 1856079 logs.go:284] No container was found matching "kube-proxy"
	I1124 10:13:00.884439 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 10:13:00.884501 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 10:13:00.910864 1856079 cri.go:89] found id: "b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:13:00.910904 1856079 cri.go:89] found id: ""
	I1124 10:13:00.910912 1856079 logs.go:282] 1 containers: [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f]
	I1124 10:13:00.910970 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:00.918415 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 10:13:00.918510 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 10:13:00.955825 1856079 cri.go:89] found id: ""
	I1124 10:13:00.955851 1856079 logs.go:282] 0 containers: []
	W1124 10:13:00.955860 1856079 logs.go:284] No container was found matching "kindnet"
	I1124 10:13:00.955867 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1124 10:13:00.955959 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1124 10:13:00.985084 1856079 cri.go:89] found id: ""
	I1124 10:13:00.985108 1856079 logs.go:282] 0 containers: []
	W1124 10:13:00.985117 1856079 logs.go:284] No container was found matching "storage-provisioner"
	I1124 10:13:00.985131 1856079 logs.go:123] Gathering logs for container status ...
	I1124 10:13:00.985142 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 10:13:01.027265 1856079 logs.go:123] Gathering logs for kubelet ...
	I1124 10:13:01.027293 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 10:13:01.089066 1856079 logs.go:123] Gathering logs for dmesg ...
	I1124 10:13:01.089108 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 10:13:01.106837 1856079 logs.go:123] Gathering logs for describe nodes ...
	I1124 10:13:01.106921 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 10:13:01.218241 1856079 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 10:13:01.218316 1856079 logs.go:123] Gathering logs for kube-apiserver [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5] ...
	I1124 10:13:01.218345 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:13:01.255014 1856079 logs.go:123] Gathering logs for etcd [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06] ...
	I1124 10:13:01.255091 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:13:01.311418 1856079 logs.go:123] Gathering logs for kube-controller-manager [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f] ...
	I1124 10:13:01.311504 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:13:01.362850 1856079 logs.go:123] Gathering logs for kube-scheduler [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834] ...
	I1124 10:13:01.362925 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:13:01.417020 1856079 logs.go:123] Gathering logs for containerd ...
	I1124 10:13:01.417100 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 10:13:03.954567 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:13:03.965462 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 10:13:03.965547 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 10:13:03.993793 1856079 cri.go:89] found id: "6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:13:03.993869 1856079 cri.go:89] found id: ""
	I1124 10:13:03.993893 1856079 logs.go:282] 1 containers: [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5]
	I1124 10:13:03.993980 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:03.998334 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 10:13:03.998409 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 10:13:04.027658 1856079 cri.go:89] found id: "c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:13:04.027686 1856079 cri.go:89] found id: ""
	I1124 10:13:04.027695 1856079 logs.go:282] 1 containers: [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06]
	I1124 10:13:04.027765 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:04.031955 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 10:13:04.032037 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 10:13:04.062637 1856079 cri.go:89] found id: ""
	I1124 10:13:04.062661 1856079 logs.go:282] 0 containers: []
	W1124 10:13:04.062671 1856079 logs.go:284] No container was found matching "coredns"
	I1124 10:13:04.062677 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 10:13:04.062739 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 10:13:04.090549 1856079 cri.go:89] found id: "9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:13:04.090574 1856079 cri.go:89] found id: ""
	I1124 10:13:04.090583 1856079 logs.go:282] 1 containers: [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834]
	I1124 10:13:04.090643 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:04.094638 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 10:13:04.094710 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 10:13:04.122646 1856079 cri.go:89] found id: ""
	I1124 10:13:04.122671 1856079 logs.go:282] 0 containers: []
	W1124 10:13:04.122682 1856079 logs.go:284] No container was found matching "kube-proxy"
	I1124 10:13:04.122688 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 10:13:04.122751 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 10:13:04.160820 1856079 cri.go:89] found id: "b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:13:04.160842 1856079 cri.go:89] found id: ""
	I1124 10:13:04.160851 1856079 logs.go:282] 1 containers: [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f]
	I1124 10:13:04.160925 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:04.166060 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 10:13:04.166141 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 10:13:04.198087 1856079 cri.go:89] found id: ""
	I1124 10:13:04.198112 1856079 logs.go:282] 0 containers: []
	W1124 10:13:04.198120 1856079 logs.go:284] No container was found matching "kindnet"
	I1124 10:13:04.198127 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1124 10:13:04.198191 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1124 10:13:04.228212 1856079 cri.go:89] found id: ""
	I1124 10:13:04.228241 1856079 logs.go:282] 0 containers: []
	W1124 10:13:04.228251 1856079 logs.go:284] No container was found matching "storage-provisioner"
	I1124 10:13:04.228265 1856079 logs.go:123] Gathering logs for etcd [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06] ...
	I1124 10:13:04.228277 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:13:04.269110 1856079 logs.go:123] Gathering logs for containerd ...
	I1124 10:13:04.269143 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 10:13:04.302754 1856079 logs.go:123] Gathering logs for container status ...
	I1124 10:13:04.302792 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 10:13:04.333139 1856079 logs.go:123] Gathering logs for kubelet ...
	I1124 10:13:04.333168 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 10:13:04.393465 1856079 logs.go:123] Gathering logs for dmesg ...
	I1124 10:13:04.393508 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 10:13:04.419438 1856079 logs.go:123] Gathering logs for kube-apiserver [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5] ...
	I1124 10:13:04.419477 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:13:04.458652 1856079 logs.go:123] Gathering logs for kube-scheduler [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834] ...
	I1124 10:13:04.458691 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:13:04.497686 1856079 logs.go:123] Gathering logs for kube-controller-manager [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f] ...
	I1124 10:13:04.497718 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:13:04.539315 1856079 logs.go:123] Gathering logs for describe nodes ...
	I1124 10:13:04.539346 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 10:13:04.608004 1856079 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 10:13:07.108283 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:13:07.118751 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 10:13:07.118825 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 10:13:07.164420 1856079 cri.go:89] found id: "6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:13:07.164446 1856079 cri.go:89] found id: ""
	I1124 10:13:07.164454 1856079 logs.go:282] 1 containers: [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5]
	I1124 10:13:07.164513 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:07.168956 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 10:13:07.169035 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 10:13:07.198704 1856079 cri.go:89] found id: "c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:13:07.198730 1856079 cri.go:89] found id: ""
	I1124 10:13:07.198738 1856079 logs.go:282] 1 containers: [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06]
	I1124 10:13:07.198795 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:07.202706 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 10:13:07.202781 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 10:13:07.227849 1856079 cri.go:89] found id: ""
	I1124 10:13:07.227876 1856079 logs.go:282] 0 containers: []
	W1124 10:13:07.227886 1856079 logs.go:284] No container was found matching "coredns"
	I1124 10:13:07.227892 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 10:13:07.227983 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 10:13:07.257991 1856079 cri.go:89] found id: "9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:13:07.258019 1856079 cri.go:89] found id: ""
	I1124 10:13:07.258029 1856079 logs.go:282] 1 containers: [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834]
	I1124 10:13:07.258089 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:07.263182 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 10:13:07.263268 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 10:13:07.296324 1856079 cri.go:89] found id: ""
	I1124 10:13:07.296347 1856079 logs.go:282] 0 containers: []
	W1124 10:13:07.296356 1856079 logs.go:284] No container was found matching "kube-proxy"
	I1124 10:13:07.296363 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 10:13:07.296519 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 10:13:07.345158 1856079 cri.go:89] found id: "b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:13:07.345182 1856079 cri.go:89] found id: ""
	I1124 10:13:07.345191 1856079 logs.go:282] 1 containers: [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f]
	I1124 10:13:07.345248 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:07.352846 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 10:13:07.352921 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 10:13:07.408912 1856079 cri.go:89] found id: ""
	I1124 10:13:07.408935 1856079 logs.go:282] 0 containers: []
	W1124 10:13:07.408944 1856079 logs.go:284] No container was found matching "kindnet"
	I1124 10:13:07.408951 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1124 10:13:07.409053 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1124 10:13:07.444506 1856079 cri.go:89] found id: ""
	I1124 10:13:07.444531 1856079 logs.go:282] 0 containers: []
	W1124 10:13:07.444540 1856079 logs.go:284] No container was found matching "storage-provisioner"
	I1124 10:13:07.444555 1856079 logs.go:123] Gathering logs for kube-scheduler [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834] ...
	I1124 10:13:07.444566 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:13:07.497085 1856079 logs.go:123] Gathering logs for kube-controller-manager [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f] ...
	I1124 10:13:07.497127 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:13:07.550164 1856079 logs.go:123] Gathering logs for containerd ...
	I1124 10:13:07.550200 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 10:13:07.592238 1856079 logs.go:123] Gathering logs for container status ...
	I1124 10:13:07.592278 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 10:13:07.626601 1856079 logs.go:123] Gathering logs for kubelet ...
	I1124 10:13:07.626627 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 10:13:07.697852 1856079 logs.go:123] Gathering logs for dmesg ...
	I1124 10:13:07.697931 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 10:13:07.721507 1856079 logs.go:123] Gathering logs for describe nodes ...
	I1124 10:13:07.721605 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 10:13:07.988923 1856079 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 10:13:07.988946 1856079 logs.go:123] Gathering logs for kube-apiserver [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5] ...
	I1124 10:13:07.988960 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:13:08.215412 1856079 logs.go:123] Gathering logs for etcd [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06] ...
	I1124 10:13:08.215463 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:13:10.795508 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:13:10.806026 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 10:13:10.806099 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 10:13:10.830954 1856079 cri.go:89] found id: "6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:13:10.830978 1856079 cri.go:89] found id: ""
	I1124 10:13:10.830987 1856079 logs.go:282] 1 containers: [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5]
	I1124 10:13:10.831043 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:10.834355 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 10:13:10.834431 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 10:13:10.861302 1856079 cri.go:89] found id: "c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:13:10.861326 1856079 cri.go:89] found id: ""
	I1124 10:13:10.861334 1856079 logs.go:282] 1 containers: [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06]
	I1124 10:13:10.861392 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:10.865200 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 10:13:10.865276 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 10:13:10.889766 1856079 cri.go:89] found id: ""
	I1124 10:13:10.889792 1856079 logs.go:282] 0 containers: []
	W1124 10:13:10.889807 1856079 logs.go:284] No container was found matching "coredns"
	I1124 10:13:10.889814 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 10:13:10.889872 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 10:13:10.916749 1856079 cri.go:89] found id: "9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:13:10.916777 1856079 cri.go:89] found id: ""
	I1124 10:13:10.916786 1856079 logs.go:282] 1 containers: [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834]
	I1124 10:13:10.916844 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:10.920503 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 10:13:10.920574 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 10:13:10.945503 1856079 cri.go:89] found id: ""
	I1124 10:13:10.945530 1856079 logs.go:282] 0 containers: []
	W1124 10:13:10.945538 1856079 logs.go:284] No container was found matching "kube-proxy"
	I1124 10:13:10.945545 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 10:13:10.945611 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 10:13:10.970793 1856079 cri.go:89] found id: "b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:13:10.970818 1856079 cri.go:89] found id: ""
	I1124 10:13:10.970826 1856079 logs.go:282] 1 containers: [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f]
	I1124 10:13:10.970889 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:10.974344 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 10:13:10.974482 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 10:13:10.998592 1856079 cri.go:89] found id: ""
	I1124 10:13:10.998658 1856079 logs.go:282] 0 containers: []
	W1124 10:13:10.998683 1856079 logs.go:284] No container was found matching "kindnet"
	I1124 10:13:10.998706 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1124 10:13:10.998791 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1124 10:13:11.026568 1856079 cri.go:89] found id: ""
	I1124 10:13:11.026645 1856079 logs.go:282] 0 containers: []
	W1124 10:13:11.026669 1856079 logs.go:284] No container was found matching "storage-provisioner"
	I1124 10:13:11.026709 1856079 logs.go:123] Gathering logs for kube-apiserver [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5] ...
	I1124 10:13:11.026738 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:13:11.061921 1856079 logs.go:123] Gathering logs for kube-controller-manager [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f] ...
	I1124 10:13:11.061956 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:13:11.095320 1856079 logs.go:123] Gathering logs for containerd ...
	I1124 10:13:11.095355 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 10:13:11.130878 1856079 logs.go:123] Gathering logs for container status ...
	I1124 10:13:11.130978 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 10:13:11.180243 1856079 logs.go:123] Gathering logs for kubelet ...
	I1124 10:13:11.180274 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 10:13:11.243711 1856079 logs.go:123] Gathering logs for describe nodes ...
	I1124 10:13:11.243747 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 10:13:11.302601 1856079 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 10:13:11.302624 1856079 logs.go:123] Gathering logs for etcd [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06] ...
	I1124 10:13:11.302637 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:13:11.334720 1856079 logs.go:123] Gathering logs for kube-scheduler [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834] ...
	I1124 10:13:11.334752 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:13:11.367735 1856079 logs.go:123] Gathering logs for dmesg ...
	I1124 10:13:11.367764 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 10:13:13.885806 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:13:13.895686 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 10:13:13.895758 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 10:13:13.920706 1856079 cri.go:89] found id: "6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:13:13.920732 1856079 cri.go:89] found id: ""
	I1124 10:13:13.920740 1856079 logs.go:282] 1 containers: [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5]
	I1124 10:13:13.920798 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:13.924511 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 10:13:13.924579 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 10:13:13.962816 1856079 cri.go:89] found id: "c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:13:13.962847 1856079 cri.go:89] found id: ""
	I1124 10:13:13.962856 1856079 logs.go:282] 1 containers: [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06]
	I1124 10:13:13.962912 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:13.967104 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 10:13:13.967188 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 10:13:14.007854 1856079 cri.go:89] found id: ""
	I1124 10:13:14.007887 1856079 logs.go:282] 0 containers: []
	W1124 10:13:14.007897 1856079 logs.go:284] No container was found matching "coredns"
	I1124 10:13:14.007905 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 10:13:14.007977 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 10:13:14.046496 1856079 cri.go:89] found id: "9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:13:14.046516 1856079 cri.go:89] found id: ""
	I1124 10:13:14.046524 1856079 logs.go:282] 1 containers: [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834]
	I1124 10:13:14.046644 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:14.050951 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 10:13:14.051021 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 10:13:14.081919 1856079 cri.go:89] found id: ""
	I1124 10:13:14.081946 1856079 logs.go:282] 0 containers: []
	W1124 10:13:14.081955 1856079 logs.go:284] No container was found matching "kube-proxy"
	I1124 10:13:14.081962 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 10:13:14.082031 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 10:13:14.109852 1856079 cri.go:89] found id: "b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:13:14.109871 1856079 cri.go:89] found id: ""
	I1124 10:13:14.109879 1856079 logs.go:282] 1 containers: [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f]
	I1124 10:13:14.109934 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:14.117656 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 10:13:14.117727 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 10:13:14.169096 1856079 cri.go:89] found id: ""
	I1124 10:13:14.169176 1856079 logs.go:282] 0 containers: []
	W1124 10:13:14.169199 1856079 logs.go:284] No container was found matching "kindnet"
	I1124 10:13:14.169220 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1124 10:13:14.169339 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1124 10:13:14.226561 1856079 cri.go:89] found id: ""
	I1124 10:13:14.226643 1856079 logs.go:282] 0 containers: []
	W1124 10:13:14.226667 1856079 logs.go:284] No container was found matching "storage-provisioner"
	I1124 10:13:14.226714 1856079 logs.go:123] Gathering logs for kubelet ...
	I1124 10:13:14.226747 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 10:13:14.294267 1856079 logs.go:123] Gathering logs for describe nodes ...
	I1124 10:13:14.294348 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 10:13:14.389690 1856079 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 10:13:14.389707 1856079 logs.go:123] Gathering logs for kube-controller-manager [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f] ...
	I1124 10:13:14.389719 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:13:14.429541 1856079 logs.go:123] Gathering logs for containerd ...
	I1124 10:13:14.429620 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 10:13:14.466682 1856079 logs.go:123] Gathering logs for container status ...
	I1124 10:13:14.466765 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 10:13:14.509234 1856079 logs.go:123] Gathering logs for dmesg ...
	I1124 10:13:14.509312 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 10:13:14.529523 1856079 logs.go:123] Gathering logs for kube-apiserver [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5] ...
	I1124 10:13:14.529677 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:13:14.576942 1856079 logs.go:123] Gathering logs for etcd [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06] ...
	I1124 10:13:14.577013 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:13:14.619729 1856079 logs.go:123] Gathering logs for kube-scheduler [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834] ...
	I1124 10:13:14.619817 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:13:17.175013 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:13:17.188187 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 10:13:17.188259 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 10:13:17.231727 1856079 cri.go:89] found id: "6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:13:17.231751 1856079 cri.go:89] found id: ""
	I1124 10:13:17.231759 1856079 logs.go:282] 1 containers: [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5]
	I1124 10:13:17.231817 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:17.235618 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 10:13:17.235693 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 10:13:17.266686 1856079 cri.go:89] found id: "c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:13:17.266707 1856079 cri.go:89] found id: ""
	I1124 10:13:17.266768 1856079 logs.go:282] 1 containers: [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06]
	I1124 10:13:17.266858 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:17.270878 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 10:13:17.270995 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 10:13:17.308681 1856079 cri.go:89] found id: ""
	I1124 10:13:17.308703 1856079 logs.go:282] 0 containers: []
	W1124 10:13:17.308712 1856079 logs.go:284] No container was found matching "coredns"
	I1124 10:13:17.308718 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 10:13:17.308775 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 10:13:17.336464 1856079 cri.go:89] found id: "9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:13:17.336482 1856079 cri.go:89] found id: ""
	I1124 10:13:17.336490 1856079 logs.go:282] 1 containers: [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834]
	I1124 10:13:17.336546 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:17.340764 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 10:13:17.340835 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 10:13:17.372081 1856079 cri.go:89] found id: ""
	I1124 10:13:17.372102 1856079 logs.go:282] 0 containers: []
	W1124 10:13:17.372110 1856079 logs.go:284] No container was found matching "kube-proxy"
	I1124 10:13:17.372116 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 10:13:17.372173 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 10:13:17.409592 1856079 cri.go:89] found id: "b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:13:17.409653 1856079 cri.go:89] found id: ""
	I1124 10:13:17.409676 1856079 logs.go:282] 1 containers: [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f]
	I1124 10:13:17.409764 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:17.413558 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 10:13:17.413647 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 10:13:17.455730 1856079 cri.go:89] found id: ""
	I1124 10:13:17.455769 1856079 logs.go:282] 0 containers: []
	W1124 10:13:17.455778 1856079 logs.go:284] No container was found matching "kindnet"
	I1124 10:13:17.455786 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1124 10:13:17.455872 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1124 10:13:17.504880 1856079 cri.go:89] found id: ""
	I1124 10:13:17.504909 1856079 logs.go:282] 0 containers: []
	W1124 10:13:17.504918 1856079 logs.go:284] No container was found matching "storage-provisioner"
	I1124 10:13:17.504932 1856079 logs.go:123] Gathering logs for kubelet ...
	I1124 10:13:17.504952 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 10:13:17.567353 1856079 logs.go:123] Gathering logs for describe nodes ...
	I1124 10:13:17.567392 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 10:13:17.643348 1856079 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 10:13:17.643369 1856079 logs.go:123] Gathering logs for etcd [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06] ...
	I1124 10:13:17.643382 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:13:17.677602 1856079 logs.go:123] Gathering logs for kube-controller-manager [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f] ...
	I1124 10:13:17.677634 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:13:17.718932 1856079 logs.go:123] Gathering logs for dmesg ...
	I1124 10:13:17.718964 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 10:13:17.735688 1856079 logs.go:123] Gathering logs for kube-apiserver [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5] ...
	I1124 10:13:17.735718 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:13:17.779153 1856079 logs.go:123] Gathering logs for kube-scheduler [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834] ...
	I1124 10:13:17.779188 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:13:17.832606 1856079 logs.go:123] Gathering logs for containerd ...
	I1124 10:13:17.832642 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 10:13:17.867528 1856079 logs.go:123] Gathering logs for container status ...
	I1124 10:13:17.867563 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 10:13:20.407806 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:13:20.422805 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 10:13:20.422883 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 10:13:20.459411 1856079 cri.go:89] found id: "6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:13:20.459437 1856079 cri.go:89] found id: ""
	I1124 10:13:20.459485 1856079 logs.go:282] 1 containers: [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5]
	I1124 10:13:20.459548 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:20.463786 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 10:13:20.463863 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 10:13:20.493069 1856079 cri.go:89] found id: "c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:13:20.493094 1856079 cri.go:89] found id: ""
	I1124 10:13:20.493102 1856079 logs.go:282] 1 containers: [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06]
	I1124 10:13:20.493156 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:20.496971 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 10:13:20.497043 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 10:13:20.532887 1856079 cri.go:89] found id: ""
	I1124 10:13:20.532915 1856079 logs.go:282] 0 containers: []
	W1124 10:13:20.532924 1856079 logs.go:284] No container was found matching "coredns"
	I1124 10:13:20.532930 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 10:13:20.532989 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 10:13:20.560512 1856079 cri.go:89] found id: "9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:13:20.560531 1856079 cri.go:89] found id: ""
	I1124 10:13:20.560539 1856079 logs.go:282] 1 containers: [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834]
	I1124 10:13:20.560593 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:20.564798 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 10:13:20.564924 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 10:13:20.600964 1856079 cri.go:89] found id: ""
	I1124 10:13:20.600986 1856079 logs.go:282] 0 containers: []
	W1124 10:13:20.600995 1856079 logs.go:284] No container was found matching "kube-proxy"
	I1124 10:13:20.601001 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 10:13:20.601063 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 10:13:20.636122 1856079 cri.go:89] found id: "b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:13:20.636201 1856079 cri.go:89] found id: ""
	I1124 10:13:20.636223 1856079 logs.go:282] 1 containers: [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f]
	I1124 10:13:20.636318 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:20.640620 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 10:13:20.640752 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 10:13:20.688805 1856079 cri.go:89] found id: ""
	I1124 10:13:20.688888 1856079 logs.go:282] 0 containers: []
	W1124 10:13:20.688912 1856079 logs.go:284] No container was found matching "kindnet"
	I1124 10:13:20.688953 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1124 10:13:20.689055 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1124 10:13:20.716903 1856079 cri.go:89] found id: ""
	I1124 10:13:20.716975 1856079 logs.go:282] 0 containers: []
	W1124 10:13:20.717011 1856079 logs.go:284] No container was found matching "storage-provisioner"
	I1124 10:13:20.717043 1856079 logs.go:123] Gathering logs for kube-controller-manager [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f] ...
	I1124 10:13:20.717089 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:13:20.752107 1856079 logs.go:123] Gathering logs for containerd ...
	I1124 10:13:20.752179 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 10:13:20.789739 1856079 logs.go:123] Gathering logs for kubelet ...
	I1124 10:13:20.789813 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 10:13:20.856018 1856079 logs.go:123] Gathering logs for describe nodes ...
	I1124 10:13:20.856096 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 10:13:21.000076 1856079 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 10:13:21.000096 1856079 logs.go:123] Gathering logs for kube-apiserver [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5] ...
	I1124 10:13:21.000108 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:13:21.064882 1856079 logs.go:123] Gathering logs for etcd [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06] ...
	I1124 10:13:21.064958 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:13:21.101474 1856079 logs.go:123] Gathering logs for container status ...
	I1124 10:13:21.101555 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 10:13:21.136036 1856079 logs.go:123] Gathering logs for dmesg ...
	I1124 10:13:21.136117 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 10:13:21.165824 1856079 logs.go:123] Gathering logs for kube-scheduler [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834] ...
	I1124 10:13:21.165906 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:13:23.741452 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:13:23.753578 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 10:13:23.753646 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 10:13:23.787104 1856079 cri.go:89] found id: "6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:13:23.787129 1856079 cri.go:89] found id: ""
	I1124 10:13:23.787138 1856079 logs.go:282] 1 containers: [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5]
	I1124 10:13:23.787206 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:23.792258 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 10:13:23.792333 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 10:13:23.824227 1856079 cri.go:89] found id: "c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:13:23.824249 1856079 cri.go:89] found id: ""
	I1124 10:13:23.824257 1856079 logs.go:282] 1 containers: [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06]
	I1124 10:13:23.824326 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:23.829292 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 10:13:23.829441 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 10:13:23.863018 1856079 cri.go:89] found id: ""
	I1124 10:13:23.863099 1856079 logs.go:282] 0 containers: []
	W1124 10:13:23.863140 1856079 logs.go:284] No container was found matching "coredns"
	I1124 10:13:23.863169 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 10:13:23.863301 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 10:13:23.925174 1856079 cri.go:89] found id: "9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:13:23.925255 1856079 cri.go:89] found id: ""
	I1124 10:13:23.925277 1856079 logs.go:282] 1 containers: [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834]
	I1124 10:13:23.925374 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:23.930151 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 10:13:23.930294 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 10:13:23.996890 1856079 cri.go:89] found id: ""
	I1124 10:13:23.996962 1856079 logs.go:282] 0 containers: []
	W1124 10:13:23.996985 1856079 logs.go:284] No container was found matching "kube-proxy"
	I1124 10:13:23.997004 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 10:13:23.997090 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 10:13:24.039769 1856079 cri.go:89] found id: "b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:13:24.039844 1856079 cri.go:89] found id: ""
	I1124 10:13:24.039866 1856079 logs.go:282] 1 containers: [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f]
	I1124 10:13:24.039958 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:24.043992 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 10:13:24.044134 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 10:13:24.076995 1856079 cri.go:89] found id: ""
	I1124 10:13:24.077024 1856079 logs.go:282] 0 containers: []
	W1124 10:13:24.077039 1856079 logs.go:284] No container was found matching "kindnet"
	I1124 10:13:24.077049 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1124 10:13:24.077159 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1124 10:13:24.109644 1856079 cri.go:89] found id: ""
	I1124 10:13:24.109666 1856079 logs.go:282] 0 containers: []
	W1124 10:13:24.109674 1856079 logs.go:284] No container was found matching "storage-provisioner"
	I1124 10:13:24.109688 1856079 logs.go:123] Gathering logs for kubelet ...
	I1124 10:13:24.109698 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 10:13:24.181399 1856079 logs.go:123] Gathering logs for dmesg ...
	I1124 10:13:24.181437 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 10:13:24.209323 1856079 logs.go:123] Gathering logs for kube-apiserver [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5] ...
	I1124 10:13:24.209362 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:13:24.273524 1856079 logs.go:123] Gathering logs for etcd [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06] ...
	I1124 10:13:24.273608 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:13:24.311824 1856079 logs.go:123] Gathering logs for kube-scheduler [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834] ...
	I1124 10:13:24.311908 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:13:24.358079 1856079 logs.go:123] Gathering logs for container status ...
	I1124 10:13:24.362609 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 10:13:24.406471 1856079 logs.go:123] Gathering logs for describe nodes ...
	I1124 10:13:24.406549 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 10:13:24.495390 1856079 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 10:13:24.495456 1856079 logs.go:123] Gathering logs for kube-controller-manager [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f] ...
	I1124 10:13:24.495495 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:13:24.548395 1856079 logs.go:123] Gathering logs for containerd ...
	I1124 10:13:24.548485 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 10:13:27.088875 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:13:27.098960 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 10:13:27.099028 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 10:13:27.129494 1856079 cri.go:89] found id: "6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:13:27.129515 1856079 cri.go:89] found id: ""
	I1124 10:13:27.129522 1856079 logs.go:282] 1 containers: [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5]
	I1124 10:13:27.129579 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:27.134640 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 10:13:27.134709 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 10:13:27.174338 1856079 cri.go:89] found id: "c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:13:27.174357 1856079 cri.go:89] found id: ""
	I1124 10:13:27.174366 1856079 logs.go:282] 1 containers: [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06]
	I1124 10:13:27.174423 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:27.178994 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 10:13:27.179062 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 10:13:27.212199 1856079 cri.go:89] found id: ""
	I1124 10:13:27.212230 1856079 logs.go:282] 0 containers: []
	W1124 10:13:27.212241 1856079 logs.go:284] No container was found matching "coredns"
	I1124 10:13:27.212247 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 10:13:27.212308 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 10:13:27.239452 1856079 cri.go:89] found id: "9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:13:27.239472 1856079 cri.go:89] found id: ""
	I1124 10:13:27.239481 1856079 logs.go:282] 1 containers: [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834]
	I1124 10:13:27.239561 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:27.243773 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 10:13:27.243859 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 10:13:27.293941 1856079 cri.go:89] found id: ""
	I1124 10:13:27.293964 1856079 logs.go:282] 0 containers: []
	W1124 10:13:27.293972 1856079 logs.go:284] No container was found matching "kube-proxy"
	I1124 10:13:27.293978 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 10:13:27.294043 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 10:13:27.328812 1856079 cri.go:89] found id: "b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:13:27.328839 1856079 cri.go:89] found id: ""
	I1124 10:13:27.328857 1856079 logs.go:282] 1 containers: [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f]
	I1124 10:13:27.328915 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:27.333096 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 10:13:27.333177 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 10:13:27.360845 1856079 cri.go:89] found id: ""
	I1124 10:13:27.360871 1856079 logs.go:282] 0 containers: []
	W1124 10:13:27.360880 1856079 logs.go:284] No container was found matching "kindnet"
	I1124 10:13:27.360889 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1124 10:13:27.360950 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1124 10:13:27.392789 1856079 cri.go:89] found id: ""
	I1124 10:13:27.392817 1856079 logs.go:282] 0 containers: []
	W1124 10:13:27.392827 1856079 logs.go:284] No container was found matching "storage-provisioner"
	I1124 10:13:27.392839 1856079 logs.go:123] Gathering logs for describe nodes ...
	I1124 10:13:27.392851 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 10:13:27.478048 1856079 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 10:13:27.478070 1856079 logs.go:123] Gathering logs for etcd [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06] ...
	I1124 10:13:27.478083 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:13:27.515812 1856079 logs.go:123] Gathering logs for kube-controller-manager [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f] ...
	I1124 10:13:27.515845 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:13:27.553071 1856079 logs.go:123] Gathering logs for containerd ...
	I1124 10:13:27.553105 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 10:13:27.590648 1856079 logs.go:123] Gathering logs for container status ...
	I1124 10:13:27.590685 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 10:13:27.662770 1856079 logs.go:123] Gathering logs for kube-apiserver [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5] ...
	I1124 10:13:27.662800 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:13:27.737000 1856079 logs.go:123] Gathering logs for kube-scheduler [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834] ...
	I1124 10:13:27.737034 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:13:27.781536 1856079 logs.go:123] Gathering logs for kubelet ...
	I1124 10:13:27.781702 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 10:13:27.847507 1856079 logs.go:123] Gathering logs for dmesg ...
	I1124 10:13:27.847589 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 10:13:30.365046 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:13:30.375558 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 10:13:30.375625 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 10:13:30.422302 1856079 cri.go:89] found id: "6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:13:30.422321 1856079 cri.go:89] found id: ""
	I1124 10:13:30.422329 1856079 logs.go:282] 1 containers: [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5]
	I1124 10:13:30.422383 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:30.426103 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 10:13:30.426167 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 10:13:30.475113 1856079 cri.go:89] found id: "c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:13:30.475134 1856079 cri.go:89] found id: ""
	I1124 10:13:30.475143 1856079 logs.go:282] 1 containers: [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06]
	I1124 10:13:30.475210 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:30.482972 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 10:13:30.483041 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 10:13:30.523634 1856079 cri.go:89] found id: ""
	I1124 10:13:30.523656 1856079 logs.go:282] 0 containers: []
	W1124 10:13:30.523664 1856079 logs.go:284] No container was found matching "coredns"
	I1124 10:13:30.523670 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 10:13:30.523732 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 10:13:30.564459 1856079 cri.go:89] found id: "9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:13:30.564530 1856079 cri.go:89] found id: ""
	I1124 10:13:30.564553 1856079 logs.go:282] 1 containers: [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834]
	I1124 10:13:30.564650 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:30.568528 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 10:13:30.568602 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 10:13:30.616647 1856079 cri.go:89] found id: ""
	I1124 10:13:30.616726 1856079 logs.go:282] 0 containers: []
	W1124 10:13:30.616737 1856079 logs.go:284] No container was found matching "kube-proxy"
	I1124 10:13:30.616744 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 10:13:30.616839 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 10:13:30.669204 1856079 cri.go:89] found id: "b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:13:30.669277 1856079 cri.go:89] found id: ""
	I1124 10:13:30.669299 1856079 logs.go:282] 1 containers: [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f]
	I1124 10:13:30.669395 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:30.690796 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 10:13:30.690947 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 10:13:30.754867 1856079 cri.go:89] found id: ""
	I1124 10:13:30.754943 1856079 logs.go:282] 0 containers: []
	W1124 10:13:30.754967 1856079 logs.go:284] No container was found matching "kindnet"
	I1124 10:13:30.754992 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1124 10:13:30.755100 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1124 10:13:30.814979 1856079 cri.go:89] found id: ""
	I1124 10:13:30.815052 1856079 logs.go:282] 0 containers: []
	W1124 10:13:30.815075 1856079 logs.go:284] No container was found matching "storage-provisioner"
	I1124 10:13:30.815102 1856079 logs.go:123] Gathering logs for container status ...
	I1124 10:13:30.815140 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 10:13:30.856162 1856079 logs.go:123] Gathering logs for dmesg ...
	I1124 10:13:30.856237 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 10:13:30.887453 1856079 logs.go:123] Gathering logs for describe nodes ...
	I1124 10:13:30.887534 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 10:13:31.000419 1856079 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 10:13:31.000490 1856079 logs.go:123] Gathering logs for kube-apiserver [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5] ...
	I1124 10:13:31.000523 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:13:31.045100 1856079 logs.go:123] Gathering logs for kube-scheduler [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834] ...
	I1124 10:13:31.045176 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:13:31.114426 1856079 logs.go:123] Gathering logs for kube-controller-manager [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f] ...
	I1124 10:13:31.118592 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:13:31.187246 1856079 logs.go:123] Gathering logs for kubelet ...
	I1124 10:13:31.187319 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 10:13:31.271954 1856079 logs.go:123] Gathering logs for etcd [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06] ...
	I1124 10:13:31.272033 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:13:31.324694 1856079 logs.go:123] Gathering logs for containerd ...
	I1124 10:13:31.324780 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 10:13:33.870141 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:13:33.882970 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 10:13:33.883042 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 10:13:33.931904 1856079 cri.go:89] found id: "6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:13:33.931923 1856079 cri.go:89] found id: ""
	I1124 10:13:33.931932 1856079 logs.go:282] 1 containers: [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5]
	I1124 10:13:33.931988 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:33.938044 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 10:13:33.938118 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 10:13:33.971314 1856079 cri.go:89] found id: "c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:13:33.971381 1856079 cri.go:89] found id: ""
	I1124 10:13:33.971405 1856079 logs.go:282] 1 containers: [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06]
	I1124 10:13:33.971507 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:33.975332 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 10:13:33.975447 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 10:13:34.016226 1856079 cri.go:89] found id: ""
	I1124 10:13:34.016296 1856079 logs.go:282] 0 containers: []
	W1124 10:13:34.016322 1856079 logs.go:284] No container was found matching "coredns"
	I1124 10:13:34.016343 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 10:13:34.016436 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 10:13:34.058587 1856079 cri.go:89] found id: "9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:13:34.058657 1856079 cri.go:89] found id: ""
	I1124 10:13:34.058682 1856079 logs.go:282] 1 containers: [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834]
	I1124 10:13:34.058777 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:34.065711 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 10:13:34.065838 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 10:13:34.126064 1856079 cri.go:89] found id: ""
	I1124 10:13:34.126148 1856079 logs.go:282] 0 containers: []
	W1124 10:13:34.126174 1856079 logs.go:284] No container was found matching "kube-proxy"
	I1124 10:13:34.126199 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 10:13:34.126323 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 10:13:34.180282 1856079 cri.go:89] found id: "b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:13:34.180358 1856079 cri.go:89] found id: ""
	I1124 10:13:34.180382 1856079 logs.go:282] 1 containers: [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f]
	I1124 10:13:34.180474 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:34.184692 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 10:13:34.184840 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 10:13:34.238495 1856079 cri.go:89] found id: ""
	I1124 10:13:34.238576 1856079 logs.go:282] 0 containers: []
	W1124 10:13:34.238607 1856079 logs.go:284] No container was found matching "kindnet"
	I1124 10:13:34.238649 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1124 10:13:34.238776 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1124 10:13:34.283528 1856079 cri.go:89] found id: ""
	I1124 10:13:34.283608 1856079 logs.go:282] 0 containers: []
	W1124 10:13:34.283632 1856079 logs.go:284] No container was found matching "storage-provisioner"
	I1124 10:13:34.283678 1856079 logs.go:123] Gathering logs for etcd [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06] ...
	I1124 10:13:34.283710 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:13:34.346177 1856079 logs.go:123] Gathering logs for kube-scheduler [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834] ...
	I1124 10:13:34.346256 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:13:34.433231 1856079 logs.go:123] Gathering logs for kube-controller-manager [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f] ...
	I1124 10:13:34.437645 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:13:34.502399 1856079 logs.go:123] Gathering logs for containerd ...
	I1124 10:13:34.502504 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 10:13:34.558995 1856079 logs.go:123] Gathering logs for container status ...
	I1124 10:13:34.559075 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 10:13:34.602873 1856079 logs.go:123] Gathering logs for kubelet ...
	I1124 10:13:34.602947 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 10:13:34.672475 1856079 logs.go:123] Gathering logs for dmesg ...
	I1124 10:13:34.672554 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 10:13:34.691123 1856079 logs.go:123] Gathering logs for describe nodes ...
	I1124 10:13:34.691265 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 10:13:34.785657 1856079 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 10:13:34.785732 1856079 logs.go:123] Gathering logs for kube-apiserver [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5] ...
	I1124 10:13:34.785761 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:13:37.348455 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:13:37.363377 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 10:13:37.363447 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 10:13:37.401692 1856079 cri.go:89] found id: "6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:13:37.401713 1856079 cri.go:89] found id: ""
	I1124 10:13:37.401720 1856079 logs.go:282] 1 containers: [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5]
	I1124 10:13:37.401777 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:37.405815 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 10:13:37.405886 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 10:13:37.435024 1856079 cri.go:89] found id: "c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:13:37.435091 1856079 cri.go:89] found id: ""
	I1124 10:13:37.435118 1856079 logs.go:282] 1 containers: [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06]
	I1124 10:13:37.435221 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:37.439441 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 10:13:37.439559 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 10:13:37.470845 1856079 cri.go:89] found id: ""
	I1124 10:13:37.470868 1856079 logs.go:282] 0 containers: []
	W1124 10:13:37.470877 1856079 logs.go:284] No container was found matching "coredns"
	I1124 10:13:37.470884 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 10:13:37.470955 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 10:13:37.504803 1856079 cri.go:89] found id: "9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:13:37.504869 1856079 cri.go:89] found id: ""
	I1124 10:13:37.504896 1856079 logs.go:282] 1 containers: [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834]
	I1124 10:13:37.504986 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:37.509176 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 10:13:37.509299 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 10:13:37.538111 1856079 cri.go:89] found id: ""
	I1124 10:13:37.538134 1856079 logs.go:282] 0 containers: []
	W1124 10:13:37.538142 1856079 logs.go:284] No container was found matching "kube-proxy"
	I1124 10:13:37.538149 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 10:13:37.538207 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 10:13:37.566947 1856079 cri.go:89] found id: "b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:13:37.566972 1856079 cri.go:89] found id: ""
	I1124 10:13:37.566981 1856079 logs.go:282] 1 containers: [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f]
	I1124 10:13:37.567037 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:37.571861 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 10:13:37.571998 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 10:13:37.599988 1856079 cri.go:89] found id: ""
	I1124 10:13:37.600066 1856079 logs.go:282] 0 containers: []
	W1124 10:13:37.600089 1856079 logs.go:284] No container was found matching "kindnet"
	I1124 10:13:37.600105 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1124 10:13:37.600182 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1124 10:13:37.624933 1856079 cri.go:89] found id: ""
	I1124 10:13:37.624959 1856079 logs.go:282] 0 containers: []
	W1124 10:13:37.624968 1856079 logs.go:284] No container was found matching "storage-provisioner"
	I1124 10:13:37.624981 1856079 logs.go:123] Gathering logs for kubelet ...
	I1124 10:13:37.624992 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 10:13:37.693850 1856079 logs.go:123] Gathering logs for dmesg ...
	I1124 10:13:37.693891 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 10:13:37.725760 1856079 logs.go:123] Gathering logs for etcd [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06] ...
	I1124 10:13:37.725797 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:13:37.783396 1856079 logs.go:123] Gathering logs for kube-controller-manager [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f] ...
	I1124 10:13:37.783432 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:13:37.830186 1856079 logs.go:123] Gathering logs for containerd ...
	I1124 10:13:37.830243 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 10:13:37.866925 1856079 logs.go:123] Gathering logs for describe nodes ...
	I1124 10:13:37.866965 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 10:13:37.956555 1856079 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 10:13:37.956580 1856079 logs.go:123] Gathering logs for kube-apiserver [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5] ...
	I1124 10:13:37.956594 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:13:37.998344 1856079 logs.go:123] Gathering logs for kube-scheduler [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834] ...
	I1124 10:13:37.998380 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:13:38.050398 1856079 logs.go:123] Gathering logs for container status ...
	I1124 10:13:38.050435 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 10:13:40.592543 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:13:40.602447 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 10:13:40.602547 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 10:13:40.631370 1856079 cri.go:89] found id: "6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:13:40.631395 1856079 cri.go:89] found id: ""
	I1124 10:13:40.631406 1856079 logs.go:282] 1 containers: [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5]
	I1124 10:13:40.631463 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:40.638552 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 10:13:40.638633 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 10:13:40.677175 1856079 cri.go:89] found id: "c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:13:40.677200 1856079 cri.go:89] found id: ""
	I1124 10:13:40.677210 1856079 logs.go:282] 1 containers: [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06]
	I1124 10:13:40.677269 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:40.682190 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 10:13:40.682266 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 10:13:40.724931 1856079 cri.go:89] found id: ""
	I1124 10:13:40.724960 1856079 logs.go:282] 0 containers: []
	W1124 10:13:40.724969 1856079 logs.go:284] No container was found matching "coredns"
	I1124 10:13:40.724976 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 10:13:40.725037 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 10:13:40.773078 1856079 cri.go:89] found id: "9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:13:40.773098 1856079 cri.go:89] found id: ""
	I1124 10:13:40.773106 1856079 logs.go:282] 1 containers: [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834]
	I1124 10:13:40.773167 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:40.777354 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 10:13:40.777429 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 10:13:40.821112 1856079 cri.go:89] found id: ""
	I1124 10:13:40.821140 1856079 logs.go:282] 0 containers: []
	W1124 10:13:40.821149 1856079 logs.go:284] No container was found matching "kube-proxy"
	I1124 10:13:40.821156 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 10:13:40.821220 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 10:13:40.877349 1856079 cri.go:89] found id: "b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:13:40.877373 1856079 cri.go:89] found id: ""
	I1124 10:13:40.877381 1856079 logs.go:282] 1 containers: [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f]
	I1124 10:13:40.877437 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:40.881821 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 10:13:40.881904 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 10:13:40.909771 1856079 cri.go:89] found id: ""
	I1124 10:13:40.909798 1856079 logs.go:282] 0 containers: []
	W1124 10:13:40.909807 1856079 logs.go:284] No container was found matching "kindnet"
	I1124 10:13:40.909814 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1124 10:13:40.909872 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1124 10:13:40.947846 1856079 cri.go:89] found id: ""
	I1124 10:13:40.947871 1856079 logs.go:282] 0 containers: []
	W1124 10:13:40.947879 1856079 logs.go:284] No container was found matching "storage-provisioner"
	I1124 10:13:40.947895 1856079 logs.go:123] Gathering logs for describe nodes ...
	I1124 10:13:40.947912 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 10:13:41.057116 1856079 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 10:13:41.057140 1856079 logs.go:123] Gathering logs for kube-apiserver [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5] ...
	I1124 10:13:41.057154 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:13:41.119014 1856079 logs.go:123] Gathering logs for etcd [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06] ...
	I1124 10:13:41.119055 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:13:41.182157 1856079 logs.go:123] Gathering logs for kube-scheduler [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834] ...
	I1124 10:13:41.182194 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:13:41.220466 1856079 logs.go:123] Gathering logs for kube-controller-manager [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f] ...
	I1124 10:13:41.220504 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:13:41.263134 1856079 logs.go:123] Gathering logs for containerd ...
	I1124 10:13:41.263165 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 10:13:41.302262 1856079 logs.go:123] Gathering logs for kubelet ...
	I1124 10:13:41.302298 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 10:13:41.361116 1856079 logs.go:123] Gathering logs for container status ...
	I1124 10:13:41.361156 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 10:13:41.402482 1856079 logs.go:123] Gathering logs for dmesg ...
	I1124 10:13:41.402513 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 10:13:43.920614 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:13:43.930961 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 10:13:43.931032 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 10:13:43.960736 1856079 cri.go:89] found id: "6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:13:43.960755 1856079 cri.go:89] found id: ""
	I1124 10:13:43.960762 1856079 logs.go:282] 1 containers: [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5]
	I1124 10:13:43.960820 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:43.964629 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 10:13:43.964751 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 10:13:43.990640 1856079 cri.go:89] found id: "c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:13:43.990664 1856079 cri.go:89] found id: ""
	I1124 10:13:43.990673 1856079 logs.go:282] 1 containers: [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06]
	I1124 10:13:43.990729 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:43.994550 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 10:13:43.994624 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 10:13:44.023543 1856079 cri.go:89] found id: ""
	I1124 10:13:44.023572 1856079 logs.go:282] 0 containers: []
	W1124 10:13:44.023581 1856079 logs.go:284] No container was found matching "coredns"
	I1124 10:13:44.023588 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 10:13:44.023651 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 10:13:44.050759 1856079 cri.go:89] found id: "9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:13:44.050780 1856079 cri.go:89] found id: ""
	I1124 10:13:44.050788 1856079 logs.go:282] 1 containers: [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834]
	I1124 10:13:44.050849 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:44.055014 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 10:13:44.055097 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 10:13:44.081191 1856079 cri.go:89] found id: ""
	I1124 10:13:44.081260 1856079 logs.go:282] 0 containers: []
	W1124 10:13:44.081283 1856079 logs.go:284] No container was found matching "kube-proxy"
	I1124 10:13:44.081308 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 10:13:44.081408 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 10:13:44.110818 1856079 cri.go:89] found id: "b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:13:44.110882 1856079 cri.go:89] found id: ""
	I1124 10:13:44.110909 1856079 logs.go:282] 1 containers: [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f]
	I1124 10:13:44.110991 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:44.114601 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 10:13:44.114707 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 10:13:44.139757 1856079 cri.go:89] found id: ""
	I1124 10:13:44.139791 1856079 logs.go:282] 0 containers: []
	W1124 10:13:44.139800 1856079 logs.go:284] No container was found matching "kindnet"
	I1124 10:13:44.139807 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1124 10:13:44.139868 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1124 10:13:44.165292 1856079 cri.go:89] found id: ""
	I1124 10:13:44.165364 1856079 logs.go:282] 0 containers: []
	W1124 10:13:44.165388 1856079 logs.go:284] No container was found matching "storage-provisioner"
	I1124 10:13:44.165416 1856079 logs.go:123] Gathering logs for kube-apiserver [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5] ...
	I1124 10:13:44.165456 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:13:44.209773 1856079 logs.go:123] Gathering logs for etcd [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06] ...
	I1124 10:13:44.209812 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:13:44.247900 1856079 logs.go:123] Gathering logs for containerd ...
	I1124 10:13:44.247933 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 10:13:44.282272 1856079 logs.go:123] Gathering logs for container status ...
	I1124 10:13:44.282308 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 10:13:44.313889 1856079 logs.go:123] Gathering logs for kubelet ...
	I1124 10:13:44.313919 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 10:13:44.376967 1856079 logs.go:123] Gathering logs for dmesg ...
	I1124 10:13:44.377006 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 10:13:44.397713 1856079 logs.go:123] Gathering logs for describe nodes ...
	I1124 10:13:44.397745 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 10:13:44.477918 1856079 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 10:13:44.477940 1856079 logs.go:123] Gathering logs for kube-scheduler [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834] ...
	I1124 10:13:44.477952 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:13:44.517182 1856079 logs.go:123] Gathering logs for kube-controller-manager [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f] ...
	I1124 10:13:44.517215 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:13:47.049049 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:13:47.059505 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 10:13:47.059580 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 10:13:47.085569 1856079 cri.go:89] found id: "6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:13:47.085592 1856079 cri.go:89] found id: ""
	I1124 10:13:47.085601 1856079 logs.go:282] 1 containers: [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5]
	I1124 10:13:47.085657 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:47.089577 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 10:13:47.089655 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 10:13:47.121059 1856079 cri.go:89] found id: "c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:13:47.121084 1856079 cri.go:89] found id: ""
	I1124 10:13:47.121093 1856079 logs.go:282] 1 containers: [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06]
	I1124 10:13:47.121150 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:47.124942 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 10:13:47.125016 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 10:13:47.156268 1856079 cri.go:89] found id: ""
	I1124 10:13:47.156303 1856079 logs.go:282] 0 containers: []
	W1124 10:13:47.156313 1856079 logs.go:284] No container was found matching "coredns"
	I1124 10:13:47.156320 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 10:13:47.156397 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 10:13:47.187948 1856079 cri.go:89] found id: "9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:13:47.187982 1856079 cri.go:89] found id: ""
	I1124 10:13:47.187991 1856079 logs.go:282] 1 containers: [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834]
	I1124 10:13:47.188048 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:47.191903 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 10:13:47.191983 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 10:13:47.217632 1856079 cri.go:89] found id: ""
	I1124 10:13:47.217658 1856079 logs.go:282] 0 containers: []
	W1124 10:13:47.217668 1856079 logs.go:284] No container was found matching "kube-proxy"
	I1124 10:13:47.217675 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 10:13:47.217738 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 10:13:47.246551 1856079 cri.go:89] found id: "b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:13:47.246572 1856079 cri.go:89] found id: ""
	I1124 10:13:47.246580 1856079 logs.go:282] 1 containers: [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f]
	I1124 10:13:47.246642 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:47.250443 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 10:13:47.250555 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 10:13:47.293490 1856079 cri.go:89] found id: ""
	I1124 10:13:47.293524 1856079 logs.go:282] 0 containers: []
	W1124 10:13:47.293533 1856079 logs.go:284] No container was found matching "kindnet"
	I1124 10:13:47.293540 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1124 10:13:47.293609 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1124 10:13:47.323076 1856079 cri.go:89] found id: ""
	I1124 10:13:47.323102 1856079 logs.go:282] 0 containers: []
	W1124 10:13:47.323111 1856079 logs.go:284] No container was found matching "storage-provisioner"
	I1124 10:13:47.323125 1856079 logs.go:123] Gathering logs for kubelet ...
	I1124 10:13:47.323138 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 10:13:47.392999 1856079 logs.go:123] Gathering logs for describe nodes ...
	I1124 10:13:47.393078 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 10:13:47.479173 1856079 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 10:13:47.479194 1856079 logs.go:123] Gathering logs for kube-apiserver [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5] ...
	I1124 10:13:47.479228 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:13:47.514509 1856079 logs.go:123] Gathering logs for etcd [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06] ...
	I1124 10:13:47.514549 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:13:47.549121 1856079 logs.go:123] Gathering logs for containerd ...
	I1124 10:13:47.549152 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 10:13:47.584470 1856079 logs.go:123] Gathering logs for container status ...
	I1124 10:13:47.584506 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 10:13:47.619915 1856079 logs.go:123] Gathering logs for dmesg ...
	I1124 10:13:47.619943 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 10:13:47.636951 1856079 logs.go:123] Gathering logs for kube-scheduler [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834] ...
	I1124 10:13:47.637036 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:13:47.671605 1856079 logs.go:123] Gathering logs for kube-controller-manager [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f] ...
	I1124 10:13:47.671640 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:13:50.209766 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:13:50.220149 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 10:13:50.220247 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 10:13:50.250516 1856079 cri.go:89] found id: "6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:13:50.250537 1856079 cri.go:89] found id: ""
	I1124 10:13:50.250545 1856079 logs.go:282] 1 containers: [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5]
	I1124 10:13:50.250601 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:50.254989 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 10:13:50.255077 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 10:13:50.292613 1856079 cri.go:89] found id: "c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:13:50.292637 1856079 cri.go:89] found id: ""
	I1124 10:13:50.292645 1856079 logs.go:282] 1 containers: [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06]
	I1124 10:13:50.292704 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:50.296757 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 10:13:50.296866 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 10:13:50.324487 1856079 cri.go:89] found id: ""
	I1124 10:13:50.324512 1856079 logs.go:282] 0 containers: []
	W1124 10:13:50.324521 1856079 logs.go:284] No container was found matching "coredns"
	I1124 10:13:50.324528 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 10:13:50.324587 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 10:13:50.356283 1856079 cri.go:89] found id: "9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:13:50.356306 1856079 cri.go:89] found id: ""
	I1124 10:13:50.356314 1856079 logs.go:282] 1 containers: [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834]
	I1124 10:13:50.356390 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:50.360508 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 10:13:50.360599 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 10:13:50.403847 1856079 cri.go:89] found id: ""
	I1124 10:13:50.403877 1856079 logs.go:282] 0 containers: []
	W1124 10:13:50.403886 1856079 logs.go:284] No container was found matching "kube-proxy"
	I1124 10:13:50.403893 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 10:13:50.403960 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 10:13:50.438288 1856079 cri.go:89] found id: "b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:13:50.438314 1856079 cri.go:89] found id: ""
	I1124 10:13:50.438322 1856079 logs.go:282] 1 containers: [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f]
	I1124 10:13:50.438388 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:50.442962 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 10:13:50.443055 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 10:13:50.476017 1856079 cri.go:89] found id: ""
	I1124 10:13:50.476044 1856079 logs.go:282] 0 containers: []
	W1124 10:13:50.476061 1856079 logs.go:284] No container was found matching "kindnet"
	I1124 10:13:50.476084 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1124 10:13:50.476163 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1124 10:13:50.502671 1856079 cri.go:89] found id: ""
	I1124 10:13:50.502697 1856079 logs.go:282] 0 containers: []
	W1124 10:13:50.502707 1856079 logs.go:284] No container was found matching "storage-provisioner"
	I1124 10:13:50.502721 1856079 logs.go:123] Gathering logs for containerd ...
	I1124 10:13:50.502732 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 10:13:50.535906 1856079 logs.go:123] Gathering logs for dmesg ...
	I1124 10:13:50.535938 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 10:13:50.553553 1856079 logs.go:123] Gathering logs for kube-apiserver [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5] ...
	I1124 10:13:50.553590 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:13:50.600094 1856079 logs.go:123] Gathering logs for etcd [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06] ...
	I1124 10:13:50.600132 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:13:50.632675 1856079 logs.go:123] Gathering logs for kube-controller-manager [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f] ...
	I1124 10:13:50.632706 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:13:50.678394 1856079 logs.go:123] Gathering logs for container status ...
	I1124 10:13:50.678429 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 10:13:50.709568 1856079 logs.go:123] Gathering logs for kubelet ...
	I1124 10:13:50.709598 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 10:13:50.769663 1856079 logs.go:123] Gathering logs for describe nodes ...
	I1124 10:13:50.769700 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 10:13:50.841371 1856079 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 10:13:50.841393 1856079 logs.go:123] Gathering logs for kube-scheduler [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834] ...
	I1124 10:13:50.841407 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:13:53.389547 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:13:53.401188 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 10:13:53.401304 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 10:13:53.433595 1856079 cri.go:89] found id: "6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:13:53.433625 1856079 cri.go:89] found id: ""
	I1124 10:13:53.433633 1856079 logs.go:282] 1 containers: [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5]
	I1124 10:13:53.433687 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:53.438304 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 10:13:53.438374 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 10:13:53.469386 1856079 cri.go:89] found id: "c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:13:53.469405 1856079 cri.go:89] found id: ""
	I1124 10:13:53.469413 1856079 logs.go:282] 1 containers: [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06]
	I1124 10:13:53.469470 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:53.473111 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 10:13:53.473187 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 10:13:53.497827 1856079 cri.go:89] found id: ""
	I1124 10:13:53.497854 1856079 logs.go:282] 0 containers: []
	W1124 10:13:53.497864 1856079 logs.go:284] No container was found matching "coredns"
	I1124 10:13:53.497871 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 10:13:53.497931 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 10:13:53.523180 1856079 cri.go:89] found id: "9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:13:53.523208 1856079 cri.go:89] found id: ""
	I1124 10:13:53.523217 1856079 logs.go:282] 1 containers: [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834]
	I1124 10:13:53.523288 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:53.526942 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 10:13:53.527022 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 10:13:53.553820 1856079 cri.go:89] found id: ""
	I1124 10:13:53.553846 1856079 logs.go:282] 0 containers: []
	W1124 10:13:53.553855 1856079 logs.go:284] No container was found matching "kube-proxy"
	I1124 10:13:53.553861 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 10:13:53.553924 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 10:13:53.580128 1856079 cri.go:89] found id: "b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:13:53.580195 1856079 cri.go:89] found id: ""
	I1124 10:13:53.580218 1856079 logs.go:282] 1 containers: [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f]
	I1124 10:13:53.580306 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:53.583997 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 10:13:53.584101 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 10:13:53.608962 1856079 cri.go:89] found id: ""
	I1124 10:13:53.608985 1856079 logs.go:282] 0 containers: []
	W1124 10:13:53.608994 1856079 logs.go:284] No container was found matching "kindnet"
	I1124 10:13:53.609000 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1124 10:13:53.609060 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1124 10:13:53.633845 1856079 cri.go:89] found id: ""
	I1124 10:13:53.633869 1856079 logs.go:282] 0 containers: []
	W1124 10:13:53.633877 1856079 logs.go:284] No container was found matching "storage-provisioner"
	I1124 10:13:53.633911 1856079 logs.go:123] Gathering logs for dmesg ...
	I1124 10:13:53.633933 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 10:13:53.651334 1856079 logs.go:123] Gathering logs for describe nodes ...
	I1124 10:13:53.651366 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 10:13:53.715933 1856079 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 10:13:53.715956 1856079 logs.go:123] Gathering logs for kube-apiserver [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5] ...
	I1124 10:13:53.715969 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:13:53.749995 1856079 logs.go:123] Gathering logs for kube-scheduler [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834] ...
	I1124 10:13:53.750045 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:13:53.784516 1856079 logs.go:123] Gathering logs for containerd ...
	I1124 10:13:53.784545 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 10:13:53.818542 1856079 logs.go:123] Gathering logs for container status ...
	I1124 10:13:53.818579 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 10:13:53.850329 1856079 logs.go:123] Gathering logs for kubelet ...
	I1124 10:13:53.850361 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 10:13:53.912426 1856079 logs.go:123] Gathering logs for etcd [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06] ...
	I1124 10:13:53.912463 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:13:53.951361 1856079 logs.go:123] Gathering logs for kube-controller-manager [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f] ...
	I1124 10:13:53.951390 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:13:56.487605 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:13:56.497501 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 10:13:56.497570 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 10:13:56.523428 1856079 cri.go:89] found id: "6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:13:56.523453 1856079 cri.go:89] found id: ""
	I1124 10:13:56.523462 1856079 logs.go:282] 1 containers: [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5]
	I1124 10:13:56.523520 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:56.527404 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 10:13:56.527478 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 10:13:56.552767 1856079 cri.go:89] found id: "c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:13:56.552793 1856079 cri.go:89] found id: ""
	I1124 10:13:56.552801 1856079 logs.go:282] 1 containers: [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06]
	I1124 10:13:56.552859 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:56.556787 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 10:13:56.556861 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 10:13:56.584637 1856079 cri.go:89] found id: ""
	I1124 10:13:56.584661 1856079 logs.go:282] 0 containers: []
	W1124 10:13:56.584670 1856079 logs.go:284] No container was found matching "coredns"
	I1124 10:13:56.584676 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 10:13:56.584737 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 10:13:56.613341 1856079 cri.go:89] found id: "9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:13:56.613362 1856079 cri.go:89] found id: ""
	I1124 10:13:56.613369 1856079 logs.go:282] 1 containers: [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834]
	I1124 10:13:56.613436 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:56.617640 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 10:13:56.617722 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 10:13:56.643975 1856079 cri.go:89] found id: ""
	I1124 10:13:56.643999 1856079 logs.go:282] 0 containers: []
	W1124 10:13:56.644014 1856079 logs.go:284] No container was found matching "kube-proxy"
	I1124 10:13:56.644022 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 10:13:56.644086 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 10:13:56.671326 1856079 cri.go:89] found id: "b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:13:56.671352 1856079 cri.go:89] found id: ""
	I1124 10:13:56.671362 1856079 logs.go:282] 1 containers: [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f]
	I1124 10:13:56.671420 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:56.675335 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 10:13:56.675468 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 10:13:56.700011 1856079 cri.go:89] found id: ""
	I1124 10:13:56.700038 1856079 logs.go:282] 0 containers: []
	W1124 10:13:56.700048 1856079 logs.go:284] No container was found matching "kindnet"
	I1124 10:13:56.700055 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1124 10:13:56.700115 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1124 10:13:56.725176 1856079 cri.go:89] found id: ""
	I1124 10:13:56.725203 1856079 logs.go:282] 0 containers: []
	W1124 10:13:56.725213 1856079 logs.go:284] No container was found matching "storage-provisioner"
	I1124 10:13:56.725226 1856079 logs.go:123] Gathering logs for describe nodes ...
	I1124 10:13:56.725238 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 10:13:56.787213 1856079 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 10:13:56.787234 1856079 logs.go:123] Gathering logs for kube-apiserver [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5] ...
	I1124 10:13:56.787246 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:13:56.826356 1856079 logs.go:123] Gathering logs for etcd [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06] ...
	I1124 10:13:56.826385 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:13:56.861994 1856079 logs.go:123] Gathering logs for kube-scheduler [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834] ...
	I1124 10:13:56.862026 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:13:56.903380 1856079 logs.go:123] Gathering logs for containerd ...
	I1124 10:13:56.903413 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 10:13:56.937947 1856079 logs.go:123] Gathering logs for container status ...
	I1124 10:13:56.937984 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 10:13:56.966819 1856079 logs.go:123] Gathering logs for kubelet ...
	I1124 10:13:56.966850 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 10:13:57.029061 1856079 logs.go:123] Gathering logs for dmesg ...
	I1124 10:13:57.029100 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 10:13:57.046804 1856079 logs.go:123] Gathering logs for kube-controller-manager [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f] ...
	I1124 10:13:57.046836 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:13:59.582729 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:13:59.593313 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 10:13:59.593385 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 10:13:59.628031 1856079 cri.go:89] found id: "6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:13:59.628056 1856079 cri.go:89] found id: ""
	I1124 10:13:59.628065 1856079 logs.go:282] 1 containers: [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5]
	I1124 10:13:59.628147 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:59.632188 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 10:13:59.632315 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 10:13:59.657578 1856079 cri.go:89] found id: "c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:13:59.657601 1856079 cri.go:89] found id: ""
	I1124 10:13:59.657610 1856079 logs.go:282] 1 containers: [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06]
	I1124 10:13:59.657666 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:59.661306 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 10:13:59.661381 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 10:13:59.687915 1856079 cri.go:89] found id: ""
	I1124 10:13:59.687941 1856079 logs.go:282] 0 containers: []
	W1124 10:13:59.687950 1856079 logs.go:284] No container was found matching "coredns"
	I1124 10:13:59.687957 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 10:13:59.688044 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 10:13:59.714107 1856079 cri.go:89] found id: "9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:13:59.714132 1856079 cri.go:89] found id: ""
	I1124 10:13:59.714140 1856079 logs.go:282] 1 containers: [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834]
	I1124 10:13:59.714198 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:59.718111 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 10:13:59.718195 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 10:13:59.744674 1856079 cri.go:89] found id: ""
	I1124 10:13:59.744710 1856079 logs.go:282] 0 containers: []
	W1124 10:13:59.744719 1856079 logs.go:284] No container was found matching "kube-proxy"
	I1124 10:13:59.744734 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 10:13:59.744815 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 10:13:59.774658 1856079 cri.go:89] found id: "b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:13:59.774682 1856079 cri.go:89] found id: ""
	I1124 10:13:59.774691 1856079 logs.go:282] 1 containers: [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f]
	I1124 10:13:59.774752 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:13:59.778586 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 10:13:59.778695 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 10:13:59.805443 1856079 cri.go:89] found id: ""
	I1124 10:13:59.805478 1856079 logs.go:282] 0 containers: []
	W1124 10:13:59.805487 1856079 logs.go:284] No container was found matching "kindnet"
	I1124 10:13:59.805494 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1124 10:13:59.805594 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1124 10:13:59.832989 1856079 cri.go:89] found id: ""
	I1124 10:13:59.833016 1856079 logs.go:282] 0 containers: []
	W1124 10:13:59.833032 1856079 logs.go:284] No container was found matching "storage-provisioner"
	I1124 10:13:59.833071 1856079 logs.go:123] Gathering logs for kube-controller-manager [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f] ...
	I1124 10:13:59.833091 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:13:59.866397 1856079 logs.go:123] Gathering logs for dmesg ...
	I1124 10:13:59.866431 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 10:13:59.883583 1856079 logs.go:123] Gathering logs for describe nodes ...
	I1124 10:13:59.883614 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 10:13:59.956263 1856079 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 10:13:59.956285 1856079 logs.go:123] Gathering logs for etcd [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06] ...
	I1124 10:13:59.956298 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:13:59.988190 1856079 logs.go:123] Gathering logs for kube-scheduler [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834] ...
	I1124 10:13:59.988219 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:14:00.123074 1856079 logs.go:123] Gathering logs for containerd ...
	I1124 10:14:00.123173 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 10:14:00.303797 1856079 logs.go:123] Gathering logs for container status ...
	I1124 10:14:00.303852 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 10:14:00.390748 1856079 logs.go:123] Gathering logs for kubelet ...
	I1124 10:14:00.390841 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 10:14:00.458784 1856079 logs.go:123] Gathering logs for kube-apiserver [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5] ...
	I1124 10:14:00.458822 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:14:02.998695 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:14:03.011877 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 10:14:03.011964 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 10:14:03.041329 1856079 cri.go:89] found id: "6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:14:03.041356 1856079 cri.go:89] found id: ""
	I1124 10:14:03.041365 1856079 logs.go:282] 1 containers: [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5]
	I1124 10:14:03.041432 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:14:03.046166 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 10:14:03.046255 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 10:14:03.078060 1856079 cri.go:89] found id: "c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:14:03.078086 1856079 cri.go:89] found id: ""
	I1124 10:14:03.078095 1856079 logs.go:282] 1 containers: [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06]
	I1124 10:14:03.078160 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:14:03.082405 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 10:14:03.082543 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 10:14:03.110562 1856079 cri.go:89] found id: ""
	I1124 10:14:03.110592 1856079 logs.go:282] 0 containers: []
	W1124 10:14:03.110602 1856079 logs.go:284] No container was found matching "coredns"
	I1124 10:14:03.110613 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 10:14:03.110696 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 10:14:03.139773 1856079 cri.go:89] found id: "9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:14:03.139798 1856079 cri.go:89] found id: ""
	I1124 10:14:03.139806 1856079 logs.go:282] 1 containers: [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834]
	I1124 10:14:03.139888 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:14:03.143783 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 10:14:03.143863 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 10:14:03.180446 1856079 cri.go:89] found id: ""
	I1124 10:14:03.180473 1856079 logs.go:282] 0 containers: []
	W1124 10:14:03.180483 1856079 logs.go:284] No container was found matching "kube-proxy"
	I1124 10:14:03.180490 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 10:14:03.180547 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 10:14:03.207925 1856079 cri.go:89] found id: "b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:14:03.207956 1856079 cri.go:89] found id: ""
	I1124 10:14:03.207965 1856079 logs.go:282] 1 containers: [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f]
	I1124 10:14:03.208022 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:14:03.211801 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 10:14:03.211872 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 10:14:03.241178 1856079 cri.go:89] found id: ""
	I1124 10:14:03.241201 1856079 logs.go:282] 0 containers: []
	W1124 10:14:03.241210 1856079 logs.go:284] No container was found matching "kindnet"
	I1124 10:14:03.241216 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1124 10:14:03.241284 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1124 10:14:03.267096 1856079 cri.go:89] found id: ""
	I1124 10:14:03.267122 1856079 logs.go:282] 0 containers: []
	W1124 10:14:03.267133 1856079 logs.go:284] No container was found matching "storage-provisioner"
	I1124 10:14:03.267162 1856079 logs.go:123] Gathering logs for container status ...
	I1124 10:14:03.267180 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 10:14:03.298997 1856079 logs.go:123] Gathering logs for kubelet ...
	I1124 10:14:03.299024 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 10:14:03.361122 1856079 logs.go:123] Gathering logs for dmesg ...
	I1124 10:14:03.361159 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 10:14:03.381628 1856079 logs.go:123] Gathering logs for kube-apiserver [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5] ...
	I1124 10:14:03.381660 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:14:03.423046 1856079 logs.go:123] Gathering logs for etcd [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06] ...
	I1124 10:14:03.423078 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:14:03.465938 1856079 logs.go:123] Gathering logs for describe nodes ...
	I1124 10:14:03.465969 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 10:14:03.538545 1856079 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 10:14:03.538608 1856079 logs.go:123] Gathering logs for kube-scheduler [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834] ...
	I1124 10:14:03.538644 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:14:03.573730 1856079 logs.go:123] Gathering logs for kube-controller-manager [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f] ...
	I1124 10:14:03.573804 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:14:03.607615 1856079 logs.go:123] Gathering logs for containerd ...
	I1124 10:14:03.607646 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 10:14:06.142411 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:14:06.153950 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 10:14:06.154032 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 10:14:06.191652 1856079 cri.go:89] found id: "6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:14:06.191677 1856079 cri.go:89] found id: ""
	I1124 10:14:06.191686 1856079 logs.go:282] 1 containers: [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5]
	I1124 10:14:06.191744 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:14:06.195423 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 10:14:06.195496 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 10:14:06.220392 1856079 cri.go:89] found id: "c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:14:06.220415 1856079 cri.go:89] found id: ""
	I1124 10:14:06.220423 1856079 logs.go:282] 1 containers: [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06]
	I1124 10:14:06.220477 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:14:06.224040 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 10:14:06.224107 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 10:14:06.247347 1856079 cri.go:89] found id: ""
	I1124 10:14:06.247374 1856079 logs.go:282] 0 containers: []
	W1124 10:14:06.247383 1856079 logs.go:284] No container was found matching "coredns"
	I1124 10:14:06.247389 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 10:14:06.247452 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 10:14:06.272036 1856079 cri.go:89] found id: "9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:14:06.272057 1856079 cri.go:89] found id: ""
	I1124 10:14:06.272065 1856079 logs.go:282] 1 containers: [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834]
	I1124 10:14:06.272122 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:14:06.275818 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 10:14:06.275915 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 10:14:06.301419 1856079 cri.go:89] found id: ""
	I1124 10:14:06.301445 1856079 logs.go:282] 0 containers: []
	W1124 10:14:06.301454 1856079 logs.go:284] No container was found matching "kube-proxy"
	I1124 10:14:06.301461 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 10:14:06.301521 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 10:14:06.327013 1856079 cri.go:89] found id: "b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:14:06.327093 1856079 cri.go:89] found id: ""
	I1124 10:14:06.327109 1856079 logs.go:282] 1 containers: [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f]
	I1124 10:14:06.327169 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:14:06.330928 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 10:14:06.330999 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 10:14:06.358042 1856079 cri.go:89] found id: ""
	I1124 10:14:06.358108 1856079 logs.go:282] 0 containers: []
	W1124 10:14:06.358132 1856079 logs.go:284] No container was found matching "kindnet"
	I1124 10:14:06.358152 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1124 10:14:06.358240 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1124 10:14:06.394677 1856079 cri.go:89] found id: ""
	I1124 10:14:06.394755 1856079 logs.go:282] 0 containers: []
	W1124 10:14:06.394785 1856079 logs.go:284] No container was found matching "storage-provisioner"
	I1124 10:14:06.394816 1856079 logs.go:123] Gathering logs for dmesg ...
	I1124 10:14:06.394841 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 10:14:06.412562 1856079 logs.go:123] Gathering logs for kube-apiserver [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5] ...
	I1124 10:14:06.412645 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:14:06.455160 1856079 logs.go:123] Gathering logs for kube-controller-manager [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f] ...
	I1124 10:14:06.455193 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:14:06.490108 1856079 logs.go:123] Gathering logs for containerd ...
	I1124 10:14:06.490140 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 10:14:06.525953 1856079 logs.go:123] Gathering logs for kubelet ...
	I1124 10:14:06.526026 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 10:14:06.589393 1856079 logs.go:123] Gathering logs for describe nodes ...
	I1124 10:14:06.589430 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 10:14:06.654302 1856079 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 10:14:06.654322 1856079 logs.go:123] Gathering logs for etcd [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06] ...
	I1124 10:14:06.654335 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:14:06.693851 1856079 logs.go:123] Gathering logs for kube-scheduler [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834] ...
	I1124 10:14:06.693885 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:14:06.735113 1856079 logs.go:123] Gathering logs for container status ...
	I1124 10:14:06.735143 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 10:14:09.266706 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:14:09.277251 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 10:14:09.277328 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 10:14:09.302914 1856079 cri.go:89] found id: "6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:14:09.302940 1856079 cri.go:89] found id: ""
	I1124 10:14:09.302949 1856079 logs.go:282] 1 containers: [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5]
	I1124 10:14:09.303005 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:14:09.306882 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 10:14:09.306968 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 10:14:09.334649 1856079 cri.go:89] found id: "c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:14:09.334677 1856079 cri.go:89] found id: ""
	I1124 10:14:09.334685 1856079 logs.go:282] 1 containers: [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06]
	I1124 10:14:09.334751 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:14:09.338507 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 10:14:09.338583 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 10:14:09.364398 1856079 cri.go:89] found id: ""
	I1124 10:14:09.364423 1856079 logs.go:282] 0 containers: []
	W1124 10:14:09.364432 1856079 logs.go:284] No container was found matching "coredns"
	I1124 10:14:09.364445 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 10:14:09.364505 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 10:14:09.399382 1856079 cri.go:89] found id: "9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:14:09.399408 1856079 cri.go:89] found id: ""
	I1124 10:14:09.399416 1856079 logs.go:282] 1 containers: [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834]
	I1124 10:14:09.399472 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:14:09.406327 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 10:14:09.406398 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 10:14:09.457279 1856079 cri.go:89] found id: ""
	I1124 10:14:09.457306 1856079 logs.go:282] 0 containers: []
	W1124 10:14:09.457317 1856079 logs.go:284] No container was found matching "kube-proxy"
	I1124 10:14:09.457324 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 10:14:09.457385 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 10:14:09.483573 1856079 cri.go:89] found id: "b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:14:09.483597 1856079 cri.go:89] found id: ""
	I1124 10:14:09.483605 1856079 logs.go:282] 1 containers: [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f]
	I1124 10:14:09.483667 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:14:09.487441 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 10:14:09.487516 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 10:14:09.512911 1856079 cri.go:89] found id: ""
	I1124 10:14:09.512934 1856079 logs.go:282] 0 containers: []
	W1124 10:14:09.512942 1856079 logs.go:284] No container was found matching "kindnet"
	I1124 10:14:09.512949 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1124 10:14:09.513009 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1124 10:14:09.539493 1856079 cri.go:89] found id: ""
	I1124 10:14:09.539519 1856079 logs.go:282] 0 containers: []
	W1124 10:14:09.539528 1856079 logs.go:284] No container was found matching "storage-provisioner"
	I1124 10:14:09.539548 1856079 logs.go:123] Gathering logs for describe nodes ...
	I1124 10:14:09.539560 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 10:14:09.609157 1856079 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 10:14:09.609178 1856079 logs.go:123] Gathering logs for etcd [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06] ...
	I1124 10:14:09.609192 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:14:09.644019 1856079 logs.go:123] Gathering logs for container status ...
	I1124 10:14:09.644050 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 10:14:09.673128 1856079 logs.go:123] Gathering logs for kubelet ...
	I1124 10:14:09.673160 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 10:14:09.738218 1856079 logs.go:123] Gathering logs for kube-apiserver [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5] ...
	I1124 10:14:09.738269 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:14:09.787783 1856079 logs.go:123] Gathering logs for kube-scheduler [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834] ...
	I1124 10:14:09.787816 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:14:09.823393 1856079 logs.go:123] Gathering logs for kube-controller-manager [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f] ...
	I1124 10:14:09.823429 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:14:09.868161 1856079 logs.go:123] Gathering logs for containerd ...
	I1124 10:14:09.868192 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 10:14:09.900929 1856079 logs.go:123] Gathering logs for dmesg ...
	I1124 10:14:09.900963 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 10:14:12.418895 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:14:12.430873 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 10:14:12.430960 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 10:14:12.463358 1856079 cri.go:89] found id: "6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:14:12.463383 1856079 cri.go:89] found id: ""
	I1124 10:14:12.463393 1856079 logs.go:282] 1 containers: [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5]
	I1124 10:14:12.463453 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:14:12.467389 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 10:14:12.467474 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 10:14:12.494149 1856079 cri.go:89] found id: "c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:14:12.494172 1856079 cri.go:89] found id: ""
	I1124 10:14:12.494180 1856079 logs.go:282] 1 containers: [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06]
	I1124 10:14:12.494241 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:14:12.498151 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 10:14:12.498258 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 10:14:12.523387 1856079 cri.go:89] found id: ""
	I1124 10:14:12.523412 1856079 logs.go:282] 0 containers: []
	W1124 10:14:12.523422 1856079 logs.go:284] No container was found matching "coredns"
	I1124 10:14:12.523428 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 10:14:12.523487 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 10:14:12.547988 1856079 cri.go:89] found id: "9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:14:12.548008 1856079 cri.go:89] found id: ""
	I1124 10:14:12.548016 1856079 logs.go:282] 1 containers: [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834]
	I1124 10:14:12.548125 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:14:12.552055 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 10:14:12.552129 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 10:14:12.577251 1856079 cri.go:89] found id: ""
	I1124 10:14:12.577278 1856079 logs.go:282] 0 containers: []
	W1124 10:14:12.577288 1856079 logs.go:284] No container was found matching "kube-proxy"
	I1124 10:14:12.577295 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 10:14:12.577354 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 10:14:12.607721 1856079 cri.go:89] found id: "b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:14:12.607744 1856079 cri.go:89] found id: ""
	I1124 10:14:12.607755 1856079 logs.go:282] 1 containers: [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f]
	I1124 10:14:12.607814 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:14:12.611694 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 10:14:12.611764 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 10:14:12.639868 1856079 cri.go:89] found id: ""
	I1124 10:14:12.639894 1856079 logs.go:282] 0 containers: []
	W1124 10:14:12.639904 1856079 logs.go:284] No container was found matching "kindnet"
	I1124 10:14:12.639911 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1124 10:14:12.639976 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1124 10:14:12.666035 1856079 cri.go:89] found id: ""
	I1124 10:14:12.666061 1856079 logs.go:282] 0 containers: []
	W1124 10:14:12.666070 1856079 logs.go:284] No container was found matching "storage-provisioner"
	I1124 10:14:12.666085 1856079 logs.go:123] Gathering logs for kubelet ...
	I1124 10:14:12.666096 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 10:14:12.725257 1856079 logs.go:123] Gathering logs for dmesg ...
	I1124 10:14:12.725293 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 10:14:12.741566 1856079 logs.go:123] Gathering logs for describe nodes ...
	I1124 10:14:12.741595 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 10:14:12.813292 1856079 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 10:14:12.813314 1856079 logs.go:123] Gathering logs for kube-apiserver [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5] ...
	I1124 10:14:12.813328 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:14:12.847216 1856079 logs.go:123] Gathering logs for etcd [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06] ...
	I1124 10:14:12.847314 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:14:12.886856 1856079 logs.go:123] Gathering logs for kube-controller-manager [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f] ...
	I1124 10:14:12.886890 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:14:12.924685 1856079 logs.go:123] Gathering logs for containerd ...
	I1124 10:14:12.924717 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 10:14:12.959061 1856079 logs.go:123] Gathering logs for container status ...
	I1124 10:14:12.959101 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 10:14:12.990511 1856079 logs.go:123] Gathering logs for kube-scheduler [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834] ...
	I1124 10:14:12.990547 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:14:15.528663 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:14:15.538772 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 10:14:15.538847 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 10:14:15.564795 1856079 cri.go:89] found id: "6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:14:15.564856 1856079 cri.go:89] found id: ""
	I1124 10:14:15.564888 1856079 logs.go:282] 1 containers: [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5]
	I1124 10:14:15.564953 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:14:15.568852 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 10:14:15.568927 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 10:14:15.597914 1856079 cri.go:89] found id: "c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:14:15.597938 1856079 cri.go:89] found id: ""
	I1124 10:14:15.597947 1856079 logs.go:282] 1 containers: [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06]
	I1124 10:14:15.598004 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:14:15.601842 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 10:14:15.601917 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 10:14:15.629011 1856079 cri.go:89] found id: ""
	I1124 10:14:15.629038 1856079 logs.go:282] 0 containers: []
	W1124 10:14:15.629048 1856079 logs.go:284] No container was found matching "coredns"
	I1124 10:14:15.629055 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 10:14:15.629118 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 10:14:15.654422 1856079 cri.go:89] found id: "9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:14:15.654446 1856079 cri.go:89] found id: ""
	I1124 10:14:15.654505 1856079 logs.go:282] 1 containers: [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834]
	I1124 10:14:15.654569 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:14:15.658534 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 10:14:15.658605 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 10:14:15.685760 1856079 cri.go:89] found id: ""
	I1124 10:14:15.685836 1856079 logs.go:282] 0 containers: []
	W1124 10:14:15.685878 1856079 logs.go:284] No container was found matching "kube-proxy"
	I1124 10:14:15.685900 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 10:14:15.685994 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 10:14:15.711010 1856079 cri.go:89] found id: "b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:14:15.711034 1856079 cri.go:89] found id: ""
	I1124 10:14:15.711042 1856079 logs.go:282] 1 containers: [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f]
	I1124 10:14:15.711118 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:14:15.714790 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 10:14:15.714862 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 10:14:15.740308 1856079 cri.go:89] found id: ""
	I1124 10:14:15.740334 1856079 logs.go:282] 0 containers: []
	W1124 10:14:15.740343 1856079 logs.go:284] No container was found matching "kindnet"
	I1124 10:14:15.740350 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1124 10:14:15.740411 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1124 10:14:15.771518 1856079 cri.go:89] found id: ""
	I1124 10:14:15.771542 1856079 logs.go:282] 0 containers: []
	W1124 10:14:15.771552 1856079 logs.go:284] No container was found matching "storage-provisioner"
	I1124 10:14:15.771567 1856079 logs.go:123] Gathering logs for describe nodes ...
	I1124 10:14:15.771580 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 10:14:15.849473 1856079 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 10:14:15.849496 1856079 logs.go:123] Gathering logs for kube-scheduler [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834] ...
	I1124 10:14:15.849509 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:14:15.889416 1856079 logs.go:123] Gathering logs for kube-controller-manager [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f] ...
	I1124 10:14:15.889453 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:14:15.925560 1856079 logs.go:123] Gathering logs for containerd ...
	I1124 10:14:15.925592 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 10:14:15.959013 1856079 logs.go:123] Gathering logs for kubelet ...
	I1124 10:14:15.959050 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 10:14:16.022964 1856079 logs.go:123] Gathering logs for dmesg ...
	I1124 10:14:16.023002 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 10:14:16.040479 1856079 logs.go:123] Gathering logs for kube-apiserver [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5] ...
	I1124 10:14:16.040514 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:14:16.077449 1856079 logs.go:123] Gathering logs for etcd [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06] ...
	I1124 10:14:16.077485 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:14:16.117106 1856079 logs.go:123] Gathering logs for container status ...
	I1124 10:14:16.117138 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 10:14:18.657024 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:14:18.667446 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 10:14:18.667521 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 10:14:18.692937 1856079 cri.go:89] found id: "6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:14:18.692960 1856079 cri.go:89] found id: ""
	I1124 10:14:18.692968 1856079 logs.go:282] 1 containers: [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5]
	I1124 10:14:18.693026 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:14:18.696717 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 10:14:18.696792 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 10:14:18.722711 1856079 cri.go:89] found id: "c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:14:18.722735 1856079 cri.go:89] found id: ""
	I1124 10:14:18.722744 1856079 logs.go:282] 1 containers: [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06]
	I1124 10:14:18.722802 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:14:18.726641 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 10:14:18.726721 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 10:14:18.752071 1856079 cri.go:89] found id: ""
	I1124 10:14:18.752098 1856079 logs.go:282] 0 containers: []
	W1124 10:14:18.752107 1856079 logs.go:284] No container was found matching "coredns"
	I1124 10:14:18.752114 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 10:14:18.752177 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 10:14:18.780218 1856079 cri.go:89] found id: "9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:14:18.780241 1856079 cri.go:89] found id: ""
	I1124 10:14:18.780250 1856079 logs.go:282] 1 containers: [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834]
	I1124 10:14:18.780320 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:14:18.784073 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 10:14:18.784148 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 10:14:18.813071 1856079 cri.go:89] found id: ""
	I1124 10:14:18.813095 1856079 logs.go:282] 0 containers: []
	W1124 10:14:18.813105 1856079 logs.go:284] No container was found matching "kube-proxy"
	I1124 10:14:18.813112 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 10:14:18.813178 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 10:14:18.840910 1856079 cri.go:89] found id: "b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:14:18.840934 1856079 cri.go:89] found id: ""
	I1124 10:14:18.840942 1856079 logs.go:282] 1 containers: [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f]
	I1124 10:14:18.840998 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:14:18.845692 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 10:14:18.845765 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 10:14:18.870347 1856079 cri.go:89] found id: ""
	I1124 10:14:18.870378 1856079 logs.go:282] 0 containers: []
	W1124 10:14:18.870388 1856079 logs.go:284] No container was found matching "kindnet"
	I1124 10:14:18.870395 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1124 10:14:18.870484 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1124 10:14:18.894941 1856079 cri.go:89] found id: ""
	I1124 10:14:18.894963 1856079 logs.go:282] 0 containers: []
	W1124 10:14:18.894972 1856079 logs.go:284] No container was found matching "storage-provisioner"
	I1124 10:14:18.894987 1856079 logs.go:123] Gathering logs for kubelet ...
	I1124 10:14:18.894997 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 10:14:18.953035 1856079 logs.go:123] Gathering logs for dmesg ...
	I1124 10:14:18.953070 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 10:14:18.971105 1856079 logs.go:123] Gathering logs for etcd [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06] ...
	I1124 10:14:18.971136 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:14:19.014125 1856079 logs.go:123] Gathering logs for kube-scheduler [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834] ...
	I1124 10:14:19.014159 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:14:19.055195 1856079 logs.go:123] Gathering logs for kube-controller-manager [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f] ...
	I1124 10:14:19.055229 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:14:19.089802 1856079 logs.go:123] Gathering logs for containerd ...
	I1124 10:14:19.089838 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 10:14:19.122132 1856079 logs.go:123] Gathering logs for describe nodes ...
	I1124 10:14:19.122166 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 10:14:19.221350 1856079 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 10:14:19.221373 1856079 logs.go:123] Gathering logs for kube-apiserver [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5] ...
	I1124 10:14:19.221387 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:14:19.256913 1856079 logs.go:123] Gathering logs for container status ...
	I1124 10:14:19.256950 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 10:14:21.785660 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:14:21.795790 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 10:14:21.795860 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 10:14:21.821757 1856079 cri.go:89] found id: "6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:14:21.821778 1856079 cri.go:89] found id: ""
	I1124 10:14:21.821787 1856079 logs.go:282] 1 containers: [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5]
	I1124 10:14:21.821846 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:14:21.825780 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 10:14:21.825855 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 10:14:21.858932 1856079 cri.go:89] found id: "c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:14:21.858966 1856079 cri.go:89] found id: ""
	I1124 10:14:21.858976 1856079 logs.go:282] 1 containers: [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06]
	I1124 10:14:21.859037 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:14:21.863074 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 10:14:21.863159 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 10:14:21.889502 1856079 cri.go:89] found id: ""
	I1124 10:14:21.889527 1856079 logs.go:282] 0 containers: []
	W1124 10:14:21.889537 1856079 logs.go:284] No container was found matching "coredns"
	I1124 10:14:21.889548 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 10:14:21.889615 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 10:14:21.921335 1856079 cri.go:89] found id: "9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:14:21.921359 1856079 cri.go:89] found id: ""
	I1124 10:14:21.921367 1856079 logs.go:282] 1 containers: [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834]
	I1124 10:14:21.921429 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:14:21.925458 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 10:14:21.925533 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 10:14:21.950368 1856079 cri.go:89] found id: ""
	I1124 10:14:21.950394 1856079 logs.go:282] 0 containers: []
	W1124 10:14:21.950404 1856079 logs.go:284] No container was found matching "kube-proxy"
	I1124 10:14:21.950410 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 10:14:21.950534 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 10:14:21.977523 1856079 cri.go:89] found id: "b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:14:21.977543 1856079 cri.go:89] found id: ""
	I1124 10:14:21.977567 1856079 logs.go:282] 1 containers: [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f]
	I1124 10:14:21.977638 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:14:21.981969 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 10:14:21.982104 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 10:14:22.015062 1856079 cri.go:89] found id: ""
	I1124 10:14:22.015093 1856079 logs.go:282] 0 containers: []
	W1124 10:14:22.015103 1856079 logs.go:284] No container was found matching "kindnet"
	I1124 10:14:22.015111 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1124 10:14:22.015182 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1124 10:14:22.043555 1856079 cri.go:89] found id: ""
	I1124 10:14:22.043584 1856079 logs.go:282] 0 containers: []
	W1124 10:14:22.043595 1856079 logs.go:284] No container was found matching "storage-provisioner"
	I1124 10:14:22.043610 1856079 logs.go:123] Gathering logs for describe nodes ...
	I1124 10:14:22.043622 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 10:14:22.114439 1856079 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 10:14:22.114485 1856079 logs.go:123] Gathering logs for etcd [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06] ...
	I1124 10:14:22.114500 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:14:22.173140 1856079 logs.go:123] Gathering logs for kube-scheduler [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834] ...
	I1124 10:14:22.173178 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:14:22.219352 1856079 logs.go:123] Gathering logs for kube-controller-manager [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f] ...
	I1124 10:14:22.219392 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:14:22.252370 1856079 logs.go:123] Gathering logs for containerd ...
	I1124 10:14:22.252401 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 10:14:22.286628 1856079 logs.go:123] Gathering logs for container status ...
	I1124 10:14:22.286662 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 10:14:22.316445 1856079 logs.go:123] Gathering logs for dmesg ...
	I1124 10:14:22.316475 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 10:14:22.333476 1856079 logs.go:123] Gathering logs for kube-apiserver [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5] ...
	I1124 10:14:22.333506 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:14:22.387446 1856079 logs.go:123] Gathering logs for kubelet ...
	I1124 10:14:22.387481 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 10:14:24.952571 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:14:24.962537 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 10:14:24.962607 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 10:14:24.987444 1856079 cri.go:89] found id: "6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:14:24.987475 1856079 cri.go:89] found id: ""
	I1124 10:14:24.987483 1856079 logs.go:282] 1 containers: [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5]
	I1124 10:14:24.987539 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:14:24.991329 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 10:14:24.991422 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 10:14:25.025631 1856079 cri.go:89] found id: "c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:14:25.025720 1856079 cri.go:89] found id: ""
	I1124 10:14:25.025743 1856079 logs.go:282] 1 containers: [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06]
	I1124 10:14:25.025830 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:14:25.029773 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 10:14:25.029871 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 10:14:25.058494 1856079 cri.go:89] found id: ""
	I1124 10:14:25.058567 1856079 logs.go:282] 0 containers: []
	W1124 10:14:25.058581 1856079 logs.go:284] No container was found matching "coredns"
	I1124 10:14:25.058589 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 10:14:25.058664 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 10:14:25.088540 1856079 cri.go:89] found id: "9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:14:25.088575 1856079 cri.go:89] found id: ""
	I1124 10:14:25.088584 1856079 logs.go:282] 1 containers: [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834]
	I1124 10:14:25.088652 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:14:25.092618 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 10:14:25.092719 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 10:14:25.118444 1856079 cri.go:89] found id: ""
	I1124 10:14:25.118498 1856079 logs.go:282] 0 containers: []
	W1124 10:14:25.118507 1856079 logs.go:284] No container was found matching "kube-proxy"
	I1124 10:14:25.118523 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 10:14:25.118600 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 10:14:25.166742 1856079 cri.go:89] found id: "b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:14:25.166767 1856079 cri.go:89] found id: ""
	I1124 10:14:25.166776 1856079 logs.go:282] 1 containers: [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f]
	I1124 10:14:25.166835 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:14:25.172567 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 10:14:25.172646 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 10:14:25.210035 1856079 cri.go:89] found id: ""
	I1124 10:14:25.210061 1856079 logs.go:282] 0 containers: []
	W1124 10:14:25.210070 1856079 logs.go:284] No container was found matching "kindnet"
	I1124 10:14:25.210077 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1124 10:14:25.210143 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1124 10:14:25.239861 1856079 cri.go:89] found id: ""
	I1124 10:14:25.239886 1856079 logs.go:282] 0 containers: []
	W1124 10:14:25.239896 1856079 logs.go:284] No container was found matching "storage-provisioner"
	I1124 10:14:25.239910 1856079 logs.go:123] Gathering logs for kube-scheduler [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834] ...
	I1124 10:14:25.239951 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:14:25.281910 1856079 logs.go:123] Gathering logs for kube-controller-manager [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f] ...
	I1124 10:14:25.281944 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:14:25.315004 1856079 logs.go:123] Gathering logs for containerd ...
	I1124 10:14:25.315039 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 10:14:25.354176 1856079 logs.go:123] Gathering logs for container status ...
	I1124 10:14:25.354209 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 10:14:25.388012 1856079 logs.go:123] Gathering logs for kubelet ...
	I1124 10:14:25.388040 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 10:14:25.445807 1856079 logs.go:123] Gathering logs for describe nodes ...
	I1124 10:14:25.445844 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 10:14:25.513057 1856079 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 10:14:25.513076 1856079 logs.go:123] Gathering logs for etcd [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06] ...
	I1124 10:14:25.513088 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:14:25.545603 1856079 logs.go:123] Gathering logs for dmesg ...
	I1124 10:14:25.545638 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 10:14:25.562175 1856079 logs.go:123] Gathering logs for kube-apiserver [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5] ...
	I1124 10:14:25.562206 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:14:28.097353 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:14:28.110601 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 10:14:28.110670 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 10:14:28.150638 1856079 cri.go:89] found id: "6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:14:28.150657 1856079 cri.go:89] found id: ""
	I1124 10:14:28.150666 1856079 logs.go:282] 1 containers: [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5]
	I1124 10:14:28.150722 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:14:28.155895 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 10:14:28.155964 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 10:14:28.220685 1856079 cri.go:89] found id: "c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:14:28.220706 1856079 cri.go:89] found id: ""
	I1124 10:14:28.220713 1856079 logs.go:282] 1 containers: [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06]
	I1124 10:14:28.220770 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:14:28.231190 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 10:14:28.231275 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 10:14:28.301688 1856079 cri.go:89] found id: ""
	I1124 10:14:28.301710 1856079 logs.go:282] 0 containers: []
	W1124 10:14:28.301719 1856079 logs.go:284] No container was found matching "coredns"
	I1124 10:14:28.301725 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 10:14:28.301792 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 10:14:28.328179 1856079 cri.go:89] found id: "9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:14:28.328199 1856079 cri.go:89] found id: ""
	I1124 10:14:28.328206 1856079 logs.go:282] 1 containers: [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834]
	I1124 10:14:28.328261 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:14:28.332525 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 10:14:28.332593 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 10:14:28.360578 1856079 cri.go:89] found id: ""
	I1124 10:14:28.360601 1856079 logs.go:282] 0 containers: []
	W1124 10:14:28.360612 1856079 logs.go:284] No container was found matching "kube-proxy"
	I1124 10:14:28.360619 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 10:14:28.360681 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 10:14:28.390671 1856079 cri.go:89] found id: "b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:14:28.390745 1856079 cri.go:89] found id: ""
	I1124 10:14:28.390784 1856079 logs.go:282] 1 containers: [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f]
	I1124 10:14:28.390878 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:14:28.395241 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 10:14:28.395367 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 10:14:28.428513 1856079 cri.go:89] found id: ""
	I1124 10:14:28.428592 1856079 logs.go:282] 0 containers: []
	W1124 10:14:28.428629 1856079 logs.go:284] No container was found matching "kindnet"
	I1124 10:14:28.428656 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1124 10:14:28.428752 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1124 10:14:28.464084 1856079 cri.go:89] found id: ""
	I1124 10:14:28.464165 1856079 logs.go:282] 0 containers: []
	W1124 10:14:28.464189 1856079 logs.go:284] No container was found matching "storage-provisioner"
	I1124 10:14:28.464235 1856079 logs.go:123] Gathering logs for containerd ...
	I1124 10:14:28.464266 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 10:14:28.501015 1856079 logs.go:123] Gathering logs for container status ...
	I1124 10:14:28.505460 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 10:14:28.569531 1856079 logs.go:123] Gathering logs for kube-apiserver [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5] ...
	I1124 10:14:28.569607 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:14:28.609181 1856079 logs.go:123] Gathering logs for etcd [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06] ...
	I1124 10:14:28.609258 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:14:28.670959 1856079 logs.go:123] Gathering logs for kube-controller-manager [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f] ...
	I1124 10:14:28.670991 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:14:28.720447 1856079 logs.go:123] Gathering logs for kubelet ...
	I1124 10:14:28.720490 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 10:14:28.795025 1856079 logs.go:123] Gathering logs for dmesg ...
	I1124 10:14:28.795062 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 10:14:28.812760 1856079 logs.go:123] Gathering logs for describe nodes ...
	I1124 10:14:28.812792 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 10:14:28.898668 1856079 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 10:14:28.898691 1856079 logs.go:123] Gathering logs for kube-scheduler [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834] ...
	I1124 10:14:28.898704 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:14:31.437610 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:14:31.453046 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 10:14:31.453113 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 10:14:31.486870 1856079 cri.go:89] found id: "6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:14:31.486907 1856079 cri.go:89] found id: ""
	I1124 10:14:31.486916 1856079 logs.go:282] 1 containers: [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5]
	I1124 10:14:31.486978 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:14:31.491282 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 10:14:31.491368 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 10:14:31.520851 1856079 cri.go:89] found id: "c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:14:31.520877 1856079 cri.go:89] found id: ""
	I1124 10:14:31.520886 1856079 logs.go:282] 1 containers: [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06]
	I1124 10:14:31.520945 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:14:31.525316 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 10:14:31.525395 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 10:14:31.557685 1856079 cri.go:89] found id: ""
	I1124 10:14:31.557713 1856079 logs.go:282] 0 containers: []
	W1124 10:14:31.557723 1856079 logs.go:284] No container was found matching "coredns"
	I1124 10:14:31.557730 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 10:14:31.557797 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 10:14:31.590717 1856079 cri.go:89] found id: "9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:14:31.590744 1856079 cri.go:89] found id: ""
	I1124 10:14:31.590752 1856079 logs.go:282] 1 containers: [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834]
	I1124 10:14:31.590809 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:14:31.595159 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 10:14:31.595254 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 10:14:31.626010 1856079 cri.go:89] found id: ""
	I1124 10:14:31.626038 1856079 logs.go:282] 0 containers: []
	W1124 10:14:31.626047 1856079 logs.go:284] No container was found matching "kube-proxy"
	I1124 10:14:31.626055 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 10:14:31.626119 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 10:14:31.656686 1856079 cri.go:89] found id: "b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:14:31.656802 1856079 cri.go:89] found id: ""
	I1124 10:14:31.656811 1856079 logs.go:282] 1 containers: [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f]
	I1124 10:14:31.656879 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:14:31.661721 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 10:14:31.661800 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 10:14:31.696712 1856079 cri.go:89] found id: ""
	I1124 10:14:31.696741 1856079 logs.go:282] 0 containers: []
	W1124 10:14:31.696750 1856079 logs.go:284] No container was found matching "kindnet"
	I1124 10:14:31.696758 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1124 10:14:31.696820 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1124 10:14:31.735237 1856079 cri.go:89] found id: ""
	I1124 10:14:31.735264 1856079 logs.go:282] 0 containers: []
	W1124 10:14:31.735274 1856079 logs.go:284] No container was found matching "storage-provisioner"
	I1124 10:14:31.735288 1856079 logs.go:123] Gathering logs for kubelet ...
	I1124 10:14:31.735299 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 10:14:31.804751 1856079 logs.go:123] Gathering logs for dmesg ...
	I1124 10:14:31.804829 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 10:14:31.823832 1856079 logs.go:123] Gathering logs for describe nodes ...
	I1124 10:14:31.823905 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 10:14:31.918242 1856079 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 10:14:31.918301 1856079 logs.go:123] Gathering logs for etcd [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06] ...
	I1124 10:14:31.918331 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:14:31.959523 1856079 logs.go:123] Gathering logs for kube-scheduler [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834] ...
	I1124 10:14:31.959599 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:14:32.019736 1856079 logs.go:123] Gathering logs for kube-controller-manager [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f] ...
	I1124 10:14:32.019902 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:14:32.081590 1856079 logs.go:123] Gathering logs for containerd ...
	I1124 10:14:32.081632 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 10:14:32.124772 1856079 logs.go:123] Gathering logs for kube-apiserver [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5] ...
	I1124 10:14:32.124813 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:14:32.198013 1856079 logs.go:123] Gathering logs for container status ...
	I1124 10:14:32.198046 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 10:14:34.766697 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:14:34.777965 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 10:14:34.778040 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 10:14:34.834914 1856079 cri.go:89] found id: "6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:14:34.834934 1856079 cri.go:89] found id: ""
	I1124 10:14:34.834942 1856079 logs.go:282] 1 containers: [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5]
	I1124 10:14:34.834997 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:14:34.838842 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 10:14:34.838908 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 10:14:34.868933 1856079 cri.go:89] found id: "c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:14:34.868952 1856079 cri.go:89] found id: ""
	I1124 10:14:34.868961 1856079 logs.go:282] 1 containers: [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06]
	I1124 10:14:34.869017 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:14:34.872948 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 10:14:34.873021 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 10:14:34.906599 1856079 cri.go:89] found id: ""
	I1124 10:14:34.906619 1856079 logs.go:282] 0 containers: []
	W1124 10:14:34.906628 1856079 logs.go:284] No container was found matching "coredns"
	I1124 10:14:34.906634 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 10:14:34.906688 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 10:14:34.954872 1856079 cri.go:89] found id: "9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:14:34.954892 1856079 cri.go:89] found id: ""
	I1124 10:14:34.954899 1856079 logs.go:282] 1 containers: [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834]
	I1124 10:14:34.954960 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:14:34.959314 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 10:14:34.959394 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 10:14:34.994310 1856079 cri.go:89] found id: ""
	I1124 10:14:34.994332 1856079 logs.go:282] 0 containers: []
	W1124 10:14:34.994340 1856079 logs.go:284] No container was found matching "kube-proxy"
	I1124 10:14:34.994347 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 10:14:34.994409 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 10:14:35.027372 1856079 cri.go:89] found id: "b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:14:35.027393 1856079 cri.go:89] found id: ""
	I1124 10:14:35.027402 1856079 logs.go:282] 1 containers: [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f]
	I1124 10:14:35.027470 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:14:35.033724 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 10:14:35.033802 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 10:14:35.061058 1856079 cri.go:89] found id: ""
	I1124 10:14:35.061080 1856079 logs.go:282] 0 containers: []
	W1124 10:14:35.061089 1856079 logs.go:284] No container was found matching "kindnet"
	I1124 10:14:35.061095 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1124 10:14:35.061155 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1124 10:14:35.102056 1856079 cri.go:89] found id: ""
	I1124 10:14:35.102142 1856079 logs.go:282] 0 containers: []
	W1124 10:14:35.102166 1856079 logs.go:284] No container was found matching "storage-provisioner"
	I1124 10:14:35.102215 1856079 logs.go:123] Gathering logs for kube-apiserver [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5] ...
	I1124 10:14:35.102250 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:14:35.169048 1856079 logs.go:123] Gathering logs for containerd ...
	I1124 10:14:35.169080 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 10:14:35.232437 1856079 logs.go:123] Gathering logs for container status ...
	I1124 10:14:35.232509 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 10:14:35.303143 1856079 logs.go:123] Gathering logs for dmesg ...
	I1124 10:14:35.303254 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 10:14:35.328086 1856079 logs.go:123] Gathering logs for etcd [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06] ...
	I1124 10:14:35.328229 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:14:35.389857 1856079 logs.go:123] Gathering logs for kube-scheduler [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834] ...
	I1124 10:14:35.390009 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:14:35.450114 1856079 logs.go:123] Gathering logs for kube-controller-manager [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f] ...
	I1124 10:14:35.450152 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:14:35.497167 1856079 logs.go:123] Gathering logs for kubelet ...
	I1124 10:14:35.498518 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 10:14:35.564675 1856079 logs.go:123] Gathering logs for describe nodes ...
	I1124 10:14:35.564709 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 10:14:35.643842 1856079 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 10:14:38.144090 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:14:38.154614 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 10:14:38.154689 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 10:14:38.186287 1856079 cri.go:89] found id: "6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:14:38.186313 1856079 cri.go:89] found id: ""
	I1124 10:14:38.186321 1856079 logs.go:282] 1 containers: [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5]
	I1124 10:14:38.186379 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:14:38.190496 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 10:14:38.190576 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 10:14:38.216917 1856079 cri.go:89] found id: "c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:14:38.216945 1856079 cri.go:89] found id: ""
	I1124 10:14:38.216955 1856079 logs.go:282] 1 containers: [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06]
	I1124 10:14:38.217013 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:14:38.221028 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 10:14:38.221107 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 10:14:38.246446 1856079 cri.go:89] found id: ""
	I1124 10:14:38.246517 1856079 logs.go:282] 0 containers: []
	W1124 10:14:38.246527 1856079 logs.go:284] No container was found matching "coredns"
	I1124 10:14:38.246534 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 10:14:38.246599 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 10:14:38.273457 1856079 cri.go:89] found id: "9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:14:38.273482 1856079 cri.go:89] found id: ""
	I1124 10:14:38.273489 1856079 logs.go:282] 1 containers: [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834]
	I1124 10:14:38.273545 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:14:38.277257 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 10:14:38.277331 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 10:14:38.302532 1856079 cri.go:89] found id: ""
	I1124 10:14:38.302561 1856079 logs.go:282] 0 containers: []
	W1124 10:14:38.302571 1856079 logs.go:284] No container was found matching "kube-proxy"
	I1124 10:14:38.302578 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 10:14:38.302645 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 10:14:38.331323 1856079 cri.go:89] found id: "b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:14:38.331389 1856079 cri.go:89] found id: ""
	I1124 10:14:38.331413 1856079 logs.go:282] 1 containers: [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f]
	I1124 10:14:38.331506 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:14:38.335065 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 10:14:38.335139 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 10:14:38.360676 1856079 cri.go:89] found id: ""
	I1124 10:14:38.360701 1856079 logs.go:282] 0 containers: []
	W1124 10:14:38.360710 1856079 logs.go:284] No container was found matching "kindnet"
	I1124 10:14:38.360716 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1124 10:14:38.360777 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1124 10:14:38.385346 1856079 cri.go:89] found id: ""
	I1124 10:14:38.385368 1856079 logs.go:282] 0 containers: []
	W1124 10:14:38.385376 1856079 logs.go:284] No container was found matching "storage-provisioner"
	I1124 10:14:38.385389 1856079 logs.go:123] Gathering logs for describe nodes ...
	I1124 10:14:38.385401 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 10:14:38.449977 1856079 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 10:14:38.449995 1856079 logs.go:123] Gathering logs for etcd [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06] ...
	I1124 10:14:38.450007 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:14:38.486738 1856079 logs.go:123] Gathering logs for kube-scheduler [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834] ...
	I1124 10:14:38.486770 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:14:38.527781 1856079 logs.go:123] Gathering logs for containerd ...
	I1124 10:14:38.527818 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 10:14:38.560089 1856079 logs.go:123] Gathering logs for container status ...
	I1124 10:14:38.560125 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 10:14:38.590409 1856079 logs.go:123] Gathering logs for dmesg ...
	I1124 10:14:38.590437 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 10:14:38.607076 1856079 logs.go:123] Gathering logs for kube-apiserver [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5] ...
	I1124 10:14:38.607105 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:14:38.641103 1856079 logs.go:123] Gathering logs for kube-controller-manager [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f] ...
	I1124 10:14:38.641136 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:14:38.672368 1856079 logs.go:123] Gathering logs for kubelet ...
	I1124 10:14:38.672401 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 10:14:41.231328 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:14:41.247527 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 10:14:41.247629 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 10:14:41.287290 1856079 cri.go:89] found id: "6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:14:41.287314 1856079 cri.go:89] found id: ""
	I1124 10:14:41.287323 1856079 logs.go:282] 1 containers: [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5]
	I1124 10:14:41.287379 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:14:41.291132 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 10:14:41.291207 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 10:14:41.337208 1856079 cri.go:89] found id: "c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:14:41.337227 1856079 cri.go:89] found id: ""
	I1124 10:14:41.337235 1856079 logs.go:282] 1 containers: [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06]
	I1124 10:14:41.337287 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:14:41.344071 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 10:14:41.344139 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 10:14:41.388346 1856079 cri.go:89] found id: ""
	I1124 10:14:41.388375 1856079 logs.go:282] 0 containers: []
	W1124 10:14:41.388384 1856079 logs.go:284] No container was found matching "coredns"
	I1124 10:14:41.388391 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 10:14:41.391558 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 10:14:41.441291 1856079 cri.go:89] found id: "9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:14:41.441357 1856079 cri.go:89] found id: ""
	I1124 10:14:41.441382 1856079 logs.go:282] 1 containers: [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834]
	I1124 10:14:41.441467 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:14:41.445310 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 10:14:41.445413 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 10:14:41.474810 1856079 cri.go:89] found id: ""
	I1124 10:14:41.474836 1856079 logs.go:282] 0 containers: []
	W1124 10:14:41.474845 1856079 logs.go:284] No container was found matching "kube-proxy"
	I1124 10:14:41.474852 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 10:14:41.474964 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 10:14:41.505532 1856079 cri.go:89] found id: "b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:14:41.505559 1856079 cri.go:89] found id: ""
	I1124 10:14:41.505567 1856079 logs.go:282] 1 containers: [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f]
	I1124 10:14:41.505662 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:14:41.509894 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 10:14:41.510009 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 10:14:41.536957 1856079 cri.go:89] found id: ""
	I1124 10:14:41.536983 1856079 logs.go:282] 0 containers: []
	W1124 10:14:41.536993 1856079 logs.go:284] No container was found matching "kindnet"
	I1124 10:14:41.536999 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1124 10:14:41.537114 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1124 10:14:41.571806 1856079 cri.go:89] found id: ""
	I1124 10:14:41.571833 1856079 logs.go:282] 0 containers: []
	W1124 10:14:41.571843 1856079 logs.go:284] No container was found matching "storage-provisioner"
	I1124 10:14:41.571890 1856079 logs.go:123] Gathering logs for kubelet ...
	I1124 10:14:41.571909 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 10:14:41.634804 1856079 logs.go:123] Gathering logs for dmesg ...
	I1124 10:14:41.634841 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 10:14:41.652998 1856079 logs.go:123] Gathering logs for describe nodes ...
	I1124 10:14:41.653027 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 10:14:41.731604 1856079 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 10:14:41.731630 1856079 logs.go:123] Gathering logs for containerd ...
	I1124 10:14:41.731643 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 10:14:41.766375 1856079 logs.go:123] Gathering logs for kube-apiserver [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5] ...
	I1124 10:14:41.766409 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:14:41.814632 1856079 logs.go:123] Gathering logs for etcd [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06] ...
	I1124 10:14:41.814667 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:14:41.873997 1856079 logs.go:123] Gathering logs for kube-scheduler [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834] ...
	I1124 10:14:41.874027 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:14:41.944471 1856079 logs.go:123] Gathering logs for kube-controller-manager [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f] ...
	I1124 10:14:41.944507 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:14:42.026441 1856079 logs.go:123] Gathering logs for container status ...
	I1124 10:14:42.026501 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 10:14:44.572827 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:14:44.582973 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 10:14:44.583048 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 10:14:44.610536 1856079 cri.go:89] found id: "6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:14:44.610561 1856079 cri.go:89] found id: ""
	I1124 10:14:44.610569 1856079 logs.go:282] 1 containers: [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5]
	I1124 10:14:44.610627 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:14:44.614563 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 10:14:44.614642 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 10:14:44.641852 1856079 cri.go:89] found id: "c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:14:44.641890 1856079 cri.go:89] found id: ""
	I1124 10:14:44.641899 1856079 logs.go:282] 1 containers: [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06]
	I1124 10:14:44.641957 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:14:44.645507 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 10:14:44.645595 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 10:14:44.672508 1856079 cri.go:89] found id: ""
	I1124 10:14:44.672581 1856079 logs.go:282] 0 containers: []
	W1124 10:14:44.672598 1856079 logs.go:284] No container was found matching "coredns"
	I1124 10:14:44.672605 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 10:14:44.672682 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 10:14:44.698374 1856079 cri.go:89] found id: "9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:14:44.698402 1856079 cri.go:89] found id: ""
	I1124 10:14:44.698409 1856079 logs.go:282] 1 containers: [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834]
	I1124 10:14:44.698509 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:14:44.702333 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 10:14:44.702414 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 10:14:44.728910 1856079 cri.go:89] found id: ""
	I1124 10:14:44.728936 1856079 logs.go:282] 0 containers: []
	W1124 10:14:44.728945 1856079 logs.go:284] No container was found matching "kube-proxy"
	I1124 10:14:44.728952 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 10:14:44.729009 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 10:14:44.754725 1856079 cri.go:89] found id: "b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:14:44.754747 1856079 cri.go:89] found id: ""
	I1124 10:14:44.754756 1856079 logs.go:282] 1 containers: [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f]
	I1124 10:14:44.754810 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:14:44.758697 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 10:14:44.758767 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 10:14:44.783334 1856079 cri.go:89] found id: ""
	I1124 10:14:44.783363 1856079 logs.go:282] 0 containers: []
	W1124 10:14:44.783373 1856079 logs.go:284] No container was found matching "kindnet"
	I1124 10:14:44.783380 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1124 10:14:44.783438 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1124 10:14:44.808791 1856079 cri.go:89] found id: ""
	I1124 10:14:44.808864 1856079 logs.go:282] 0 containers: []
	W1124 10:14:44.808887 1856079 logs.go:284] No container was found matching "storage-provisioner"
	I1124 10:14:44.808936 1856079 logs.go:123] Gathering logs for dmesg ...
	I1124 10:14:44.808965 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 10:14:44.826502 1856079 logs.go:123] Gathering logs for kube-apiserver [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5] ...
	I1124 10:14:44.826531 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:14:44.868558 1856079 logs.go:123] Gathering logs for containerd ...
	I1124 10:14:44.868596 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 10:14:44.908274 1856079 logs.go:123] Gathering logs for container status ...
	I1124 10:14:44.908329 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 10:14:44.950958 1856079 logs.go:123] Gathering logs for kubelet ...
	I1124 10:14:44.950984 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 10:14:45.011678 1856079 logs.go:123] Gathering logs for describe nodes ...
	I1124 10:14:45.011727 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 10:14:45.113993 1856079 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 10:14:45.114083 1856079 logs.go:123] Gathering logs for etcd [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06] ...
	I1124 10:14:45.114105 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:14:45.163595 1856079 logs.go:123] Gathering logs for kube-scheduler [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834] ...
	I1124 10:14:45.163638 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:14:45.213550 1856079 logs.go:123] Gathering logs for kube-controller-manager [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f] ...
	I1124 10:14:45.213597 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:14:47.767103 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:14:47.777038 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 10:14:47.777111 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 10:14:47.801655 1856079 cri.go:89] found id: "6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:14:47.801675 1856079 cri.go:89] found id: ""
	I1124 10:14:47.801682 1856079 logs.go:282] 1 containers: [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5]
	I1124 10:14:47.801737 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:14:47.805239 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 10:14:47.805313 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 10:14:47.829839 1856079 cri.go:89] found id: "c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:14:47.829863 1856079 cri.go:89] found id: ""
	I1124 10:14:47.829871 1856079 logs.go:282] 1 containers: [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06]
	I1124 10:14:47.829929 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:14:47.833606 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 10:14:47.833684 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 10:14:47.858409 1856079 cri.go:89] found id: ""
	I1124 10:14:47.858432 1856079 logs.go:282] 0 containers: []
	W1124 10:14:47.858440 1856079 logs.go:284] No container was found matching "coredns"
	I1124 10:14:47.858447 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 10:14:47.858539 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 10:14:47.892070 1856079 cri.go:89] found id: "9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:14:47.892097 1856079 cri.go:89] found id: ""
	I1124 10:14:47.892105 1856079 logs.go:282] 1 containers: [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834]
	I1124 10:14:47.892164 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:14:47.896290 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 10:14:47.896370 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 10:14:47.921839 1856079 cri.go:89] found id: ""
	I1124 10:14:47.921867 1856079 logs.go:282] 0 containers: []
	W1124 10:14:47.921876 1856079 logs.go:284] No container was found matching "kube-proxy"
	I1124 10:14:47.921882 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 10:14:47.921943 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 10:14:47.956300 1856079 cri.go:89] found id: "b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:14:47.956329 1856079 cri.go:89] found id: ""
	I1124 10:14:47.956339 1856079 logs.go:282] 1 containers: [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f]
	I1124 10:14:47.956396 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:14:47.960499 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 10:14:47.960604 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 10:14:47.990552 1856079 cri.go:89] found id: ""
	I1124 10:14:47.990574 1856079 logs.go:282] 0 containers: []
	W1124 10:14:47.990583 1856079 logs.go:284] No container was found matching "kindnet"
	I1124 10:14:47.990618 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1124 10:14:47.990711 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1124 10:14:48.018049 1856079 cri.go:89] found id: ""
	I1124 10:14:48.018075 1856079 logs.go:282] 0 containers: []
	W1124 10:14:48.018085 1856079 logs.go:284] No container was found matching "storage-provisioner"
	I1124 10:14:48.018100 1856079 logs.go:123] Gathering logs for etcd [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06] ...
	I1124 10:14:48.018114 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:14:48.058074 1856079 logs.go:123] Gathering logs for kube-scheduler [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834] ...
	I1124 10:14:48.058109 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:14:48.093381 1856079 logs.go:123] Gathering logs for containerd ...
	I1124 10:14:48.093424 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 10:14:48.126567 1856079 logs.go:123] Gathering logs for dmesg ...
	I1124 10:14:48.126601 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 10:14:48.143598 1856079 logs.go:123] Gathering logs for kube-controller-manager [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f] ...
	I1124 10:14:48.143628 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:14:48.175833 1856079 logs.go:123] Gathering logs for container status ...
	I1124 10:14:48.175866 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 10:14:48.203925 1856079 logs.go:123] Gathering logs for kubelet ...
	I1124 10:14:48.203955 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 10:14:48.262115 1856079 logs.go:123] Gathering logs for describe nodes ...
	I1124 10:14:48.262156 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 10:14:48.332570 1856079 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 10:14:48.332644 1856079 logs.go:123] Gathering logs for kube-apiserver [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5] ...
	I1124 10:14:48.332664 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:14:50.870582 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:14:50.894386 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 10:14:50.894478 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 10:14:50.977766 1856079 cri.go:89] found id: "6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:14:50.977801 1856079 cri.go:89] found id: ""
	I1124 10:14:50.977810 1856079 logs.go:282] 1 containers: [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5]
	I1124 10:14:50.977863 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:14:50.981974 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 10:14:50.982049 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 10:14:51.012888 1856079 cri.go:89] found id: "c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:14:51.012930 1856079 cri.go:89] found id: ""
	I1124 10:14:51.012940 1856079 logs.go:282] 1 containers: [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06]
	I1124 10:14:51.013011 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:14:51.017430 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 10:14:51.017520 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 10:14:51.045818 1856079 cri.go:89] found id: ""
	I1124 10:14:51.045842 1856079 logs.go:282] 0 containers: []
	W1124 10:14:51.045850 1856079 logs.go:284] No container was found matching "coredns"
	I1124 10:14:51.045857 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 10:14:51.045920 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 10:14:51.084256 1856079 cri.go:89] found id: "9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:14:51.084280 1856079 cri.go:89] found id: ""
	I1124 10:14:51.084289 1856079 logs.go:282] 1 containers: [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834]
	I1124 10:14:51.084352 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:14:51.095091 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 10:14:51.095169 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 10:14:51.137204 1856079 cri.go:89] found id: ""
	I1124 10:14:51.137233 1856079 logs.go:282] 0 containers: []
	W1124 10:14:51.137244 1856079 logs.go:284] No container was found matching "kube-proxy"
	I1124 10:14:51.137251 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 10:14:51.137312 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 10:14:51.182000 1856079 cri.go:89] found id: "b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:14:51.182039 1856079 cri.go:89] found id: ""
	I1124 10:14:51.182049 1856079 logs.go:282] 1 containers: [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f]
	I1124 10:14:51.182116 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:14:51.187773 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 10:14:51.187860 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 10:14:51.218939 1856079 cri.go:89] found id: ""
	I1124 10:14:51.218963 1856079 logs.go:282] 0 containers: []
	W1124 10:14:51.218992 1856079 logs.go:284] No container was found matching "kindnet"
	I1124 10:14:51.219000 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1124 10:14:51.219076 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1124 10:14:51.245653 1856079 cri.go:89] found id: ""
	I1124 10:14:51.245695 1856079 logs.go:282] 0 containers: []
	W1124 10:14:51.245704 1856079 logs.go:284] No container was found matching "storage-provisioner"
	I1124 10:14:51.245717 1856079 logs.go:123] Gathering logs for describe nodes ...
	I1124 10:14:51.245728 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 10:14:51.342693 1856079 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 10:14:51.342715 1856079 logs.go:123] Gathering logs for kube-apiserver [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5] ...
	I1124 10:14:51.342731 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:14:51.396953 1856079 logs.go:123] Gathering logs for etcd [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06] ...
	I1124 10:14:51.396990 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:14:51.464503 1856079 logs.go:123] Gathering logs for kube-scheduler [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834] ...
	I1124 10:14:51.464542 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:14:51.513373 1856079 logs.go:123] Gathering logs for kube-controller-manager [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f] ...
	I1124 10:14:51.513404 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:14:51.554205 1856079 logs.go:123] Gathering logs for container status ...
	I1124 10:14:51.554481 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 10:14:51.599417 1856079 logs.go:123] Gathering logs for kubelet ...
	I1124 10:14:51.599442 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 10:14:51.690120 1856079 logs.go:123] Gathering logs for containerd ...
	I1124 10:14:51.690218 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 10:14:51.728470 1856079 logs.go:123] Gathering logs for dmesg ...
	I1124 10:14:51.728505 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 10:14:54.250407 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:14:54.260670 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 10:14:54.260741 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 10:14:54.287767 1856079 cri.go:89] found id: "6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:14:54.287792 1856079 cri.go:89] found id: ""
	I1124 10:14:54.287801 1856079 logs.go:282] 1 containers: [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5]
	I1124 10:14:54.287854 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:14:54.292127 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 10:14:54.292207 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 10:14:54.321522 1856079 cri.go:89] found id: "c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:14:54.321547 1856079 cri.go:89] found id: ""
	I1124 10:14:54.321555 1856079 logs.go:282] 1 containers: [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06]
	I1124 10:14:54.321612 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:14:54.325886 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 10:14:54.325960 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 10:14:54.353257 1856079 cri.go:89] found id: ""
	I1124 10:14:54.353284 1856079 logs.go:282] 0 containers: []
	W1124 10:14:54.353293 1856079 logs.go:284] No container was found matching "coredns"
	I1124 10:14:54.353300 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 10:14:54.353359 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 10:14:54.384645 1856079 cri.go:89] found id: "9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:14:54.384671 1856079 cri.go:89] found id: ""
	I1124 10:14:54.384682 1856079 logs.go:282] 1 containers: [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834]
	I1124 10:14:54.384743 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:14:54.388889 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 10:14:54.388971 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 10:14:54.423836 1856079 cri.go:89] found id: ""
	I1124 10:14:54.423864 1856079 logs.go:282] 0 containers: []
	W1124 10:14:54.423874 1856079 logs.go:284] No container was found matching "kube-proxy"
	I1124 10:14:54.423887 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 10:14:54.423951 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 10:14:54.452589 1856079 cri.go:89] found id: "b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:14:54.452614 1856079 cri.go:89] found id: ""
	I1124 10:14:54.452622 1856079 logs.go:282] 1 containers: [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f]
	I1124 10:14:54.452678 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:14:54.456992 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 10:14:54.457078 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 10:14:54.485421 1856079 cri.go:89] found id: ""
	I1124 10:14:54.485449 1856079 logs.go:282] 0 containers: []
	W1124 10:14:54.485458 1856079 logs.go:284] No container was found matching "kindnet"
	I1124 10:14:54.485465 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1124 10:14:54.485532 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1124 10:14:54.520946 1856079 cri.go:89] found id: ""
	I1124 10:14:54.520973 1856079 logs.go:282] 0 containers: []
	W1124 10:14:54.520982 1856079 logs.go:284] No container was found matching "storage-provisioner"
	I1124 10:14:54.520999 1856079 logs.go:123] Gathering logs for dmesg ...
	I1124 10:14:54.521014 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 10:14:54.540341 1856079 logs.go:123] Gathering logs for container status ...
	I1124 10:14:54.540373 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 10:14:54.576434 1856079 logs.go:123] Gathering logs for kubelet ...
	I1124 10:14:54.576469 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 10:14:54.641893 1856079 logs.go:123] Gathering logs for describe nodes ...
	I1124 10:14:54.641928 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 10:14:54.775781 1856079 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 10:14:54.775800 1856079 logs.go:123] Gathering logs for kube-apiserver [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5] ...
	I1124 10:14:54.775814 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:14:54.826951 1856079 logs.go:123] Gathering logs for etcd [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06] ...
	I1124 10:14:54.826986 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:14:54.860450 1856079 logs.go:123] Gathering logs for kube-scheduler [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834] ...
	I1124 10:14:54.860533 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:14:54.925506 1856079 logs.go:123] Gathering logs for kube-controller-manager [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f] ...
	I1124 10:14:54.930539 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:14:55.007469 1856079 logs.go:123] Gathering logs for containerd ...
	I1124 10:14:55.007563 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 10:14:57.550814 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:14:57.563291 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 10:14:57.563376 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 10:14:57.598840 1856079 cri.go:89] found id: "6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:14:57.598865 1856079 cri.go:89] found id: ""
	I1124 10:14:57.598874 1856079 logs.go:282] 1 containers: [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5]
	I1124 10:14:57.598939 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:14:57.602748 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 10:14:57.602845 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 10:14:57.645025 1856079 cri.go:89] found id: "c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:14:57.645067 1856079 cri.go:89] found id: ""
	I1124 10:14:57.645077 1856079 logs.go:282] 1 containers: [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06]
	I1124 10:14:57.645156 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:14:57.650819 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 10:14:57.650913 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 10:14:57.727168 1856079 cri.go:89] found id: ""
	I1124 10:14:57.727220 1856079 logs.go:282] 0 containers: []
	W1124 10:14:57.727230 1856079 logs.go:284] No container was found matching "coredns"
	I1124 10:14:57.727247 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 10:14:57.727367 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 10:14:57.760048 1856079 cri.go:89] found id: "9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:14:57.760093 1856079 cri.go:89] found id: ""
	I1124 10:14:57.760102 1856079 logs.go:282] 1 containers: [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834]
	I1124 10:14:57.760182 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:14:57.764987 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 10:14:57.765076 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 10:14:57.795164 1856079 cri.go:89] found id: ""
	I1124 10:14:57.795203 1856079 logs.go:282] 0 containers: []
	W1124 10:14:57.795213 1856079 logs.go:284] No container was found matching "kube-proxy"
	I1124 10:14:57.795219 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 10:14:57.795302 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 10:14:57.833380 1856079 cri.go:89] found id: "b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:14:57.833412 1856079 cri.go:89] found id: ""
	I1124 10:14:57.833421 1856079 logs.go:282] 1 containers: [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f]
	I1124 10:14:57.833499 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:14:57.838247 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 10:14:57.838336 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 10:14:57.875372 1856079 cri.go:89] found id: ""
	I1124 10:14:57.875419 1856079 logs.go:282] 0 containers: []
	W1124 10:14:57.875434 1856079 logs.go:284] No container was found matching "kindnet"
	I1124 10:14:57.875442 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1124 10:14:57.875522 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1124 10:14:57.923849 1856079 cri.go:89] found id: ""
	I1124 10:14:57.923915 1856079 logs.go:282] 0 containers: []
	W1124 10:14:57.923938 1856079 logs.go:284] No container was found matching "storage-provisioner"
	I1124 10:14:57.923964 1856079 logs.go:123] Gathering logs for container status ...
	I1124 10:14:57.924007 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 10:14:57.962180 1856079 logs.go:123] Gathering logs for describe nodes ...
	I1124 10:14:57.962261 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 10:14:58.072377 1856079 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 10:14:58.072444 1856079 logs.go:123] Gathering logs for kube-apiserver [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5] ...
	I1124 10:14:58.072473 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:14:58.115004 1856079 logs.go:123] Gathering logs for kube-scheduler [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834] ...
	I1124 10:14:58.115088 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:14:58.157504 1856079 logs.go:123] Gathering logs for kubelet ...
	I1124 10:14:58.157543 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 10:14:58.227341 1856079 logs.go:123] Gathering logs for dmesg ...
	I1124 10:14:58.227382 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 10:14:58.243749 1856079 logs.go:123] Gathering logs for etcd [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06] ...
	I1124 10:14:58.243787 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:14:58.282201 1856079 logs.go:123] Gathering logs for kube-controller-manager [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f] ...
	I1124 10:14:58.282236 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:14:58.335636 1856079 logs.go:123] Gathering logs for containerd ...
	I1124 10:14:58.335679 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 10:15:00.873628 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:15:00.885671 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 10:15:00.885747 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 10:15:00.918574 1856079 cri.go:89] found id: "6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:15:00.918602 1856079 cri.go:89] found id: ""
	I1124 10:15:00.918610 1856079 logs.go:282] 1 containers: [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5]
	I1124 10:15:00.918668 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:15:00.922907 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 10:15:00.922980 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 10:15:00.957122 1856079 cri.go:89] found id: "c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:15:00.957143 1856079 cri.go:89] found id: ""
	I1124 10:15:00.957151 1856079 logs.go:282] 1 containers: [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06]
	I1124 10:15:00.957216 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:15:00.961554 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 10:15:00.961622 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 10:15:01.002831 1856079 cri.go:89] found id: ""
	I1124 10:15:01.002856 1856079 logs.go:282] 0 containers: []
	W1124 10:15:01.002865 1856079 logs.go:284] No container was found matching "coredns"
	I1124 10:15:01.002877 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 10:15:01.002946 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 10:15:01.032406 1856079 cri.go:89] found id: "9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:15:01.032427 1856079 cri.go:89] found id: ""
	I1124 10:15:01.032434 1856079 logs.go:282] 1 containers: [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834]
	I1124 10:15:01.032501 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:15:01.037095 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 10:15:01.037167 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 10:15:01.067489 1856079 cri.go:89] found id: ""
	I1124 10:15:01.067578 1856079 logs.go:282] 0 containers: []
	W1124 10:15:01.067604 1856079 logs.go:284] No container was found matching "kube-proxy"
	I1124 10:15:01.067643 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 10:15:01.067743 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 10:15:01.105640 1856079 cri.go:89] found id: "b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:15:01.105660 1856079 cri.go:89] found id: ""
	I1124 10:15:01.105669 1856079 logs.go:282] 1 containers: [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f]
	I1124 10:15:01.105728 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:15:01.110234 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 10:15:01.110311 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 10:15:01.152769 1856079 cri.go:89] found id: ""
	I1124 10:15:01.152792 1856079 logs.go:282] 0 containers: []
	W1124 10:15:01.152801 1856079 logs.go:284] No container was found matching "kindnet"
	I1124 10:15:01.152809 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1124 10:15:01.152871 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1124 10:15:01.207838 1856079 cri.go:89] found id: ""
	I1124 10:15:01.207916 1856079 logs.go:282] 0 containers: []
	W1124 10:15:01.207941 1856079 logs.go:284] No container was found matching "storage-provisioner"
	I1124 10:15:01.207989 1856079 logs.go:123] Gathering logs for etcd [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06] ...
	I1124 10:15:01.208023 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:15:01.255579 1856079 logs.go:123] Gathering logs for kube-scheduler [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834] ...
	I1124 10:15:01.255617 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:15:01.305088 1856079 logs.go:123] Gathering logs for kube-controller-manager [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f] ...
	I1124 10:15:01.305123 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:15:01.340126 1856079 logs.go:123] Gathering logs for container status ...
	I1124 10:15:01.340161 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 10:15:01.372768 1856079 logs.go:123] Gathering logs for kubelet ...
	I1124 10:15:01.372794 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 10:15:01.467774 1856079 logs.go:123] Gathering logs for describe nodes ...
	I1124 10:15:01.467814 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 10:15:01.571108 1856079 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 10:15:01.571133 1856079 logs.go:123] Gathering logs for containerd ...
	I1124 10:15:01.571146 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 10:15:01.605052 1856079 logs.go:123] Gathering logs for dmesg ...
	I1124 10:15:01.605091 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 10:15:01.622238 1856079 logs.go:123] Gathering logs for kube-apiserver [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5] ...
	I1124 10:15:01.622271 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:15:04.159126 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:15:04.170937 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 10:15:04.171007 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 10:15:04.201477 1856079 cri.go:89] found id: "6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:15:04.201497 1856079 cri.go:89] found id: ""
	I1124 10:15:04.201505 1856079 logs.go:282] 1 containers: [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5]
	I1124 10:15:04.201565 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:15:04.205874 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 10:15:04.205949 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 10:15:04.245020 1856079 cri.go:89] found id: "c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:15:04.245048 1856079 cri.go:89] found id: ""
	I1124 10:15:04.245058 1856079 logs.go:282] 1 containers: [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06]
	I1124 10:15:04.245119 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:15:04.249879 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 10:15:04.249962 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 10:15:04.280827 1856079 cri.go:89] found id: ""
	I1124 10:15:04.280854 1856079 logs.go:282] 0 containers: []
	W1124 10:15:04.280864 1856079 logs.go:284] No container was found matching "coredns"
	I1124 10:15:04.280871 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 10:15:04.280932 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 10:15:04.309789 1856079 cri.go:89] found id: "9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:15:04.309813 1856079 cri.go:89] found id: ""
	I1124 10:15:04.309821 1856079 logs.go:282] 1 containers: [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834]
	I1124 10:15:04.309877 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:15:04.314580 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 10:15:04.314673 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 10:15:04.343387 1856079 cri.go:89] found id: ""
	I1124 10:15:04.343414 1856079 logs.go:282] 0 containers: []
	W1124 10:15:04.343423 1856079 logs.go:284] No container was found matching "kube-proxy"
	I1124 10:15:04.343430 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 10:15:04.343488 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 10:15:04.391714 1856079 cri.go:89] found id: "b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:15:04.391740 1856079 cri.go:89] found id: ""
	I1124 10:15:04.391749 1856079 logs.go:282] 1 containers: [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f]
	I1124 10:15:04.391805 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:15:04.395919 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 10:15:04.395997 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 10:15:04.449574 1856079 cri.go:89] found id: ""
	I1124 10:15:04.449598 1856079 logs.go:282] 0 containers: []
	W1124 10:15:04.449607 1856079 logs.go:284] No container was found matching "kindnet"
	I1124 10:15:04.449614 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1124 10:15:04.449678 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1124 10:15:04.495143 1856079 cri.go:89] found id: ""
	I1124 10:15:04.495171 1856079 logs.go:282] 0 containers: []
	W1124 10:15:04.495180 1856079 logs.go:284] No container was found matching "storage-provisioner"
	I1124 10:15:04.495192 1856079 logs.go:123] Gathering logs for describe nodes ...
	I1124 10:15:04.495203 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 10:15:04.578801 1856079 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 10:15:04.578824 1856079 logs.go:123] Gathering logs for kube-apiserver [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5] ...
	I1124 10:15:04.578838 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:15:04.619853 1856079 logs.go:123] Gathering logs for etcd [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06] ...
	I1124 10:15:04.619888 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:15:04.653636 1856079 logs.go:123] Gathering logs for kube-controller-manager [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f] ...
	I1124 10:15:04.653671 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:15:04.686914 1856079 logs.go:123] Gathering logs for container status ...
	I1124 10:15:04.686949 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 10:15:04.736972 1856079 logs.go:123] Gathering logs for kube-scheduler [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834] ...
	I1124 10:15:04.737001 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:15:04.771401 1856079 logs.go:123] Gathering logs for containerd ...
	I1124 10:15:04.771436 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 10:15:04.806857 1856079 logs.go:123] Gathering logs for kubelet ...
	I1124 10:15:04.806895 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 10:15:04.870180 1856079 logs.go:123] Gathering logs for dmesg ...
	I1124 10:15:04.870218 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 10:15:07.393046 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:15:07.406096 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 10:15:07.406185 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 10:15:07.473989 1856079 cri.go:89] found id: "6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:15:07.474013 1856079 cri.go:89] found id: ""
	I1124 10:15:07.474022 1856079 logs.go:282] 1 containers: [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5]
	I1124 10:15:07.474078 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:15:07.487291 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 10:15:07.487376 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 10:15:07.537399 1856079 cri.go:89] found id: "c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:15:07.537422 1856079 cri.go:89] found id: ""
	I1124 10:15:07.537439 1856079 logs.go:282] 1 containers: [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06]
	I1124 10:15:07.537496 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:15:07.545974 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 10:15:07.546061 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 10:15:07.595347 1856079 cri.go:89] found id: ""
	I1124 10:15:07.595378 1856079 logs.go:282] 0 containers: []
	W1124 10:15:07.595387 1856079 logs.go:284] No container was found matching "coredns"
	I1124 10:15:07.595393 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 10:15:07.595452 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 10:15:07.635322 1856079 cri.go:89] found id: "9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:15:07.635346 1856079 cri.go:89] found id: ""
	I1124 10:15:07.635354 1856079 logs.go:282] 1 containers: [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834]
	I1124 10:15:07.635413 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:15:07.640540 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 10:15:07.640615 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 10:15:07.690644 1856079 cri.go:89] found id: ""
	I1124 10:15:07.690670 1856079 logs.go:282] 0 containers: []
	W1124 10:15:07.690679 1856079 logs.go:284] No container was found matching "kube-proxy"
	I1124 10:15:07.690686 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 10:15:07.690745 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 10:15:07.730232 1856079 cri.go:89] found id: "b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:15:07.730256 1856079 cri.go:89] found id: ""
	I1124 10:15:07.730265 1856079 logs.go:282] 1 containers: [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f]
	I1124 10:15:07.730323 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:15:07.738075 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 10:15:07.738155 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 10:15:07.767316 1856079 cri.go:89] found id: ""
	I1124 10:15:07.767342 1856079 logs.go:282] 0 containers: []
	W1124 10:15:07.767350 1856079 logs.go:284] No container was found matching "kindnet"
	I1124 10:15:07.767357 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1124 10:15:07.767420 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1124 10:15:07.808940 1856079 cri.go:89] found id: ""
	I1124 10:15:07.808967 1856079 logs.go:282] 0 containers: []
	W1124 10:15:07.808977 1856079 logs.go:284] No container was found matching "storage-provisioner"
	I1124 10:15:07.808992 1856079 logs.go:123] Gathering logs for kubelet ...
	I1124 10:15:07.809004 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 10:15:07.902717 1856079 logs.go:123] Gathering logs for dmesg ...
	I1124 10:15:07.902775 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 10:15:07.926963 1856079 logs.go:123] Gathering logs for etcd [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06] ...
	I1124 10:15:07.927091 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:15:08.032007 1856079 logs.go:123] Gathering logs for kube-controller-manager [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f] ...
	I1124 10:15:08.032065 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:15:08.104832 1856079 logs.go:123] Gathering logs for containerd ...
	I1124 10:15:08.104864 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 10:15:08.164579 1856079 logs.go:123] Gathering logs for container status ...
	I1124 10:15:08.164692 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 10:15:08.253152 1856079 logs.go:123] Gathering logs for describe nodes ...
	I1124 10:15:08.253194 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 10:15:08.419520 1856079 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 10:15:08.419545 1856079 logs.go:123] Gathering logs for kube-apiserver [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5] ...
	I1124 10:15:08.419572 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:15:08.529142 1856079 logs.go:123] Gathering logs for kube-scheduler [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834] ...
	I1124 10:15:08.529179 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:15:11.114582 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:15:11.125180 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 10:15:11.125252 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 10:15:11.171122 1856079 cri.go:89] found id: "6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:15:11.171157 1856079 cri.go:89] found id: ""
	I1124 10:15:11.171166 1856079 logs.go:282] 1 containers: [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5]
	I1124 10:15:11.171222 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:15:11.179003 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 10:15:11.179080 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 10:15:11.214340 1856079 cri.go:89] found id: "c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:15:11.214364 1856079 cri.go:89] found id: ""
	I1124 10:15:11.214372 1856079 logs.go:282] 1 containers: [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06]
	I1124 10:15:11.214430 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:15:11.220827 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 10:15:11.220904 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 10:15:11.275221 1856079 cri.go:89] found id: ""
	I1124 10:15:11.275253 1856079 logs.go:282] 0 containers: []
	W1124 10:15:11.275264 1856079 logs.go:284] No container was found matching "coredns"
	I1124 10:15:11.275270 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 10:15:11.275332 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 10:15:11.312394 1856079 cri.go:89] found id: "9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:15:11.312417 1856079 cri.go:89] found id: ""
	I1124 10:15:11.312425 1856079 logs.go:282] 1 containers: [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834]
	I1124 10:15:11.312481 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:15:11.318931 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 10:15:11.319035 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 10:15:11.379824 1856079 cri.go:89] found id: ""
	I1124 10:15:11.379851 1856079 logs.go:282] 0 containers: []
	W1124 10:15:11.379860 1856079 logs.go:284] No container was found matching "kube-proxy"
	I1124 10:15:11.379867 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 10:15:11.379927 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 10:15:11.420343 1856079 cri.go:89] found id: "b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:15:11.420365 1856079 cri.go:89] found id: ""
	I1124 10:15:11.420374 1856079 logs.go:282] 1 containers: [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f]
	I1124 10:15:11.420429 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:15:11.434991 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 10:15:11.435070 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 10:15:11.478168 1856079 cri.go:89] found id: ""
	I1124 10:15:11.478193 1856079 logs.go:282] 0 containers: []
	W1124 10:15:11.478203 1856079 logs.go:284] No container was found matching "kindnet"
	I1124 10:15:11.478210 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1124 10:15:11.478268 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1124 10:15:11.531509 1856079 cri.go:89] found id: ""
	I1124 10:15:11.531535 1856079 logs.go:282] 0 containers: []
	W1124 10:15:11.531545 1856079 logs.go:284] No container was found matching "storage-provisioner"
	I1124 10:15:11.531558 1856079 logs.go:123] Gathering logs for containerd ...
	I1124 10:15:11.531569 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 10:15:11.576055 1856079 logs.go:123] Gathering logs for container status ...
	I1124 10:15:11.576095 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 10:15:11.653460 1856079 logs.go:123] Gathering logs for describe nodes ...
	I1124 10:15:11.653490 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 10:15:11.806526 1856079 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 10:15:11.806549 1856079 logs.go:123] Gathering logs for etcd [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06] ...
	I1124 10:15:11.806563 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:15:11.864598 1856079 logs.go:123] Gathering logs for kube-scheduler [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834] ...
	I1124 10:15:11.864632 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:15:11.930785 1856079 logs.go:123] Gathering logs for kube-controller-manager [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f] ...
	I1124 10:15:11.930817 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:15:11.993498 1856079 logs.go:123] Gathering logs for kubelet ...
	I1124 10:15:11.993583 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 10:15:12.084817 1856079 logs.go:123] Gathering logs for dmesg ...
	I1124 10:15:12.084900 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 10:15:12.112496 1856079 logs.go:123] Gathering logs for kube-apiserver [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5] ...
	I1124 10:15:12.112577 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:15:14.680867 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:15:14.691083 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 10:15:14.691158 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 10:15:14.716953 1856079 cri.go:89] found id: "6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:15:14.716971 1856079 cri.go:89] found id: ""
	I1124 10:15:14.716979 1856079 logs.go:282] 1 containers: [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5]
	I1124 10:15:14.717036 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:15:14.721082 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 10:15:14.721157 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 10:15:14.746312 1856079 cri.go:89] found id: "c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:15:14.746331 1856079 cri.go:89] found id: ""
	I1124 10:15:14.746339 1856079 logs.go:282] 1 containers: [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06]
	I1124 10:15:14.746394 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:15:14.750243 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 10:15:14.750315 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 10:15:14.784949 1856079 cri.go:89] found id: ""
	I1124 10:15:14.784976 1856079 logs.go:282] 0 containers: []
	W1124 10:15:14.784985 1856079 logs.go:284] No container was found matching "coredns"
	I1124 10:15:14.784991 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 10:15:14.785094 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 10:15:14.815859 1856079 cri.go:89] found id: "9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:15:14.815883 1856079 cri.go:89] found id: ""
	I1124 10:15:14.815892 1856079 logs.go:282] 1 containers: [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834]
	I1124 10:15:14.815973 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:15:14.820781 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 10:15:14.820874 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 10:15:14.854745 1856079 cri.go:89] found id: ""
	I1124 10:15:14.854773 1856079 logs.go:282] 0 containers: []
	W1124 10:15:14.854782 1856079 logs.go:284] No container was found matching "kube-proxy"
	I1124 10:15:14.854789 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 10:15:14.854890 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 10:15:14.926106 1856079 cri.go:89] found id: "b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:15:14.926131 1856079 cri.go:89] found id: ""
	I1124 10:15:14.926139 1856079 logs.go:282] 1 containers: [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f]
	I1124 10:15:14.926215 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:15:14.931706 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 10:15:14.931808 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 10:15:14.990582 1856079 cri.go:89] found id: ""
	I1124 10:15:14.990610 1856079 logs.go:282] 0 containers: []
	W1124 10:15:14.990619 1856079 logs.go:284] No container was found matching "kindnet"
	I1124 10:15:14.990645 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1124 10:15:14.990733 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1124 10:15:15.053947 1856079 cri.go:89] found id: ""
	I1124 10:15:15.053975 1856079 logs.go:282] 0 containers: []
	W1124 10:15:15.053985 1856079 logs.go:284] No container was found matching "storage-provisioner"
	I1124 10:15:15.054033 1856079 logs.go:123] Gathering logs for kubelet ...
	I1124 10:15:15.054049 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 10:15:15.118230 1856079 logs.go:123] Gathering logs for dmesg ...
	I1124 10:15:15.118309 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 10:15:15.136291 1856079 logs.go:123] Gathering logs for kube-apiserver [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5] ...
	I1124 10:15:15.136332 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:15:15.187764 1856079 logs.go:123] Gathering logs for kube-controller-manager [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f] ...
	I1124 10:15:15.187800 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:15:15.228939 1856079 logs.go:123] Gathering logs for containerd ...
	I1124 10:15:15.228972 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 10:15:15.267301 1856079 logs.go:123] Gathering logs for describe nodes ...
	I1124 10:15:15.267378 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 10:15:15.365307 1856079 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 10:15:15.365372 1856079 logs.go:123] Gathering logs for etcd [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06] ...
	I1124 10:15:15.365404 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:15:15.413897 1856079 logs.go:123] Gathering logs for kube-scheduler [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834] ...
	I1124 10:15:15.413929 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:15:15.461850 1856079 logs.go:123] Gathering logs for container status ...
	I1124 10:15:15.461888 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 10:15:18.005478 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:15:18.023571 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 10:15:18.023653 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 10:15:18.058678 1856079 cri.go:89] found id: "6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:15:18.058698 1856079 cri.go:89] found id: ""
	I1124 10:15:18.058707 1856079 logs.go:282] 1 containers: [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5]
	I1124 10:15:18.058764 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:15:18.065446 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 10:15:18.065522 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 10:15:18.095088 1856079 cri.go:89] found id: "c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:15:18.095108 1856079 cri.go:89] found id: ""
	I1124 10:15:18.095116 1856079 logs.go:282] 1 containers: [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06]
	I1124 10:15:18.095173 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:15:18.099945 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 10:15:18.100019 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 10:15:18.135333 1856079 cri.go:89] found id: ""
	I1124 10:15:18.135357 1856079 logs.go:282] 0 containers: []
	W1124 10:15:18.135366 1856079 logs.go:284] No container was found matching "coredns"
	I1124 10:15:18.135373 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 10:15:18.135435 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 10:15:18.178886 1856079 cri.go:89] found id: "9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:15:18.178920 1856079 cri.go:89] found id: ""
	I1124 10:15:18.178928 1856079 logs.go:282] 1 containers: [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834]
	I1124 10:15:18.178984 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:15:18.205047 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 10:15:18.205128 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 10:15:18.253457 1856079 cri.go:89] found id: ""
	I1124 10:15:18.253482 1856079 logs.go:282] 0 containers: []
	W1124 10:15:18.253491 1856079 logs.go:284] No container was found matching "kube-proxy"
	I1124 10:15:18.253498 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 10:15:18.253560 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 10:15:18.290554 1856079 cri.go:89] found id: "b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:15:18.290588 1856079 cri.go:89] found id: ""
	I1124 10:15:18.290597 1856079 logs.go:282] 1 containers: [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f]
	I1124 10:15:18.290653 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:15:18.295429 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 10:15:18.295500 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 10:15:18.342005 1856079 cri.go:89] found id: ""
	I1124 10:15:18.342029 1856079 logs.go:282] 0 containers: []
	W1124 10:15:18.342038 1856079 logs.go:284] No container was found matching "kindnet"
	I1124 10:15:18.342045 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1124 10:15:18.342104 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1124 10:15:18.382052 1856079 cri.go:89] found id: ""
	I1124 10:15:18.382075 1856079 logs.go:282] 0 containers: []
	W1124 10:15:18.382084 1856079 logs.go:284] No container was found matching "storage-provisioner"
	I1124 10:15:18.382100 1856079 logs.go:123] Gathering logs for kubelet ...
	I1124 10:15:18.382112 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 10:15:18.459322 1856079 logs.go:123] Gathering logs for dmesg ...
	I1124 10:15:18.459396 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 10:15:18.482124 1856079 logs.go:123] Gathering logs for describe nodes ...
	I1124 10:15:18.482157 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 10:15:18.577794 1856079 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 10:15:18.577866 1856079 logs.go:123] Gathering logs for etcd [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06] ...
	I1124 10:15:18.577896 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:15:18.636958 1856079 logs.go:123] Gathering logs for containerd ...
	I1124 10:15:18.637051 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 10:15:18.743334 1856079 logs.go:123] Gathering logs for container status ...
	I1124 10:15:18.743424 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 10:15:18.783644 1856079 logs.go:123] Gathering logs for kube-apiserver [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5] ...
	I1124 10:15:18.783669 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:15:18.826814 1856079 logs.go:123] Gathering logs for kube-scheduler [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834] ...
	I1124 10:15:18.826845 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:15:18.879491 1856079 logs.go:123] Gathering logs for kube-controller-manager [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f] ...
	I1124 10:15:18.879522 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:15:21.423408 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:15:21.433864 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 10:15:21.433940 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 10:15:21.464386 1856079 cri.go:89] found id: "6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:15:21.464407 1856079 cri.go:89] found id: ""
	I1124 10:15:21.464416 1856079 logs.go:282] 1 containers: [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5]
	I1124 10:15:21.464472 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:15:21.469255 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 10:15:21.469325 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 10:15:21.499898 1856079 cri.go:89] found id: "c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:15:21.499916 1856079 cri.go:89] found id: ""
	I1124 10:15:21.499924 1856079 logs.go:282] 1 containers: [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06]
	I1124 10:15:21.499976 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:15:21.504088 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 10:15:21.504166 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 10:15:21.533208 1856079 cri.go:89] found id: ""
	I1124 10:15:21.533229 1856079 logs.go:282] 0 containers: []
	W1124 10:15:21.533237 1856079 logs.go:284] No container was found matching "coredns"
	I1124 10:15:21.533243 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 10:15:21.533305 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 10:15:21.572807 1856079 cri.go:89] found id: "9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:15:21.572826 1856079 cri.go:89] found id: ""
	I1124 10:15:21.572833 1856079 logs.go:282] 1 containers: [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834]
	I1124 10:15:21.572889 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:15:21.578714 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 10:15:21.578796 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 10:15:21.619464 1856079 cri.go:89] found id: ""
	I1124 10:15:21.619492 1856079 logs.go:282] 0 containers: []
	W1124 10:15:21.619500 1856079 logs.go:284] No container was found matching "kube-proxy"
	I1124 10:15:21.619507 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 10:15:21.619564 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 10:15:21.672958 1856079 cri.go:89] found id: "b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:15:21.672985 1856079 cri.go:89] found id: ""
	I1124 10:15:21.672993 1856079 logs.go:282] 1 containers: [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f]
	I1124 10:15:21.673052 1856079 ssh_runner.go:195] Run: which crictl
	I1124 10:15:21.677425 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 10:15:21.677500 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 10:15:21.732627 1856079 cri.go:89] found id: ""
	I1124 10:15:21.732658 1856079 logs.go:282] 0 containers: []
	W1124 10:15:21.732674 1856079 logs.go:284] No container was found matching "kindnet"
	I1124 10:15:21.732681 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1124 10:15:21.732744 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1124 10:15:21.766331 1856079 cri.go:89] found id: ""
	I1124 10:15:21.766360 1856079 logs.go:282] 0 containers: []
	W1124 10:15:21.766369 1856079 logs.go:284] No container was found matching "storage-provisioner"
	I1124 10:15:21.766382 1856079 logs.go:123] Gathering logs for describe nodes ...
	I1124 10:15:21.766393 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 10:15:21.859933 1856079 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 10:15:21.859959 1856079 logs.go:123] Gathering logs for etcd [c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06] ...
	I1124 10:15:21.859972 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06"
	I1124 10:15:21.895223 1856079 logs.go:123] Gathering logs for kube-controller-manager [b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f] ...
	I1124 10:15:21.895262 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f"
	I1124 10:15:21.954173 1856079 logs.go:123] Gathering logs for container status ...
	I1124 10:15:21.954206 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1124 10:15:21.999336 1856079 logs.go:123] Gathering logs for dmesg ...
	I1124 10:15:21.999368 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 10:15:22.019532 1856079 logs.go:123] Gathering logs for kube-apiserver [6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5] ...
	I1124 10:15:22.019562 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5"
	I1124 10:15:22.080815 1856079 logs.go:123] Gathering logs for kube-scheduler [9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834] ...
	I1124 10:15:22.080856 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834"
	I1124 10:15:22.128705 1856079 logs.go:123] Gathering logs for containerd ...
	I1124 10:15:22.128784 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 10:15:22.170693 1856079 logs.go:123] Gathering logs for kubelet ...
	I1124 10:15:22.170778 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 10:15:24.736817 1856079 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:15:24.747261 1856079 kubeadm.go:602] duration metric: took 4m2.993293197s to restartPrimaryControlPlane
	W1124 10:15:24.747331 1856079 out.go:285] ! Unable to restart control-plane node(s), will reset cluster: <no value>
	! Unable to restart control-plane node(s), will reset cluster: <no value>
	I1124 10:15:24.747406 1856079 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm reset --cri-socket /run/containerd/containerd.sock --force"
	I1124 10:15:25.248260 1856079 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1124 10:15:25.264389 1856079 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1124 10:15:25.275807 1856079 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1124 10:15:25.275876 1856079 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1124 10:15:25.287608 1856079 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1124 10:15:25.287631 1856079 kubeadm.go:158] found existing configuration files:
	
	I1124 10:15:25.287684 1856079 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf
	I1124 10:15:25.297017 1856079 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1124 10:15:25.297084 1856079 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1124 10:15:25.305492 1856079 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf
	I1124 10:15:25.316872 1856079 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1124 10:15:25.316937 1856079 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1124 10:15:25.324893 1856079 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf
	I1124 10:15:25.333652 1856079 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1124 10:15:25.333717 1856079 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1124 10:15:25.342012 1856079 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf
	I1124 10:15:25.351571 1856079 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1124 10:15:25.351652 1856079 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1124 10:15:25.360932 1856079 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1124 10:15:25.407560 1856079 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-beta.0
	I1124 10:15:25.407917 1856079 kubeadm.go:319] [preflight] Running pre-flight checks
	I1124 10:15:25.525638 1856079 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1124 10:15:25.525721 1856079 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1124 10:15:25.525758 1856079 kubeadm.go:319] OS: Linux
	I1124 10:15:25.525809 1856079 kubeadm.go:319] CGROUPS_CPU: enabled
	I1124 10:15:25.525862 1856079 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1124 10:15:25.525912 1856079 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1124 10:15:25.525964 1856079 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1124 10:15:25.526021 1856079 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1124 10:15:25.526073 1856079 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1124 10:15:25.526123 1856079 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1124 10:15:25.526176 1856079 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1124 10:15:25.526229 1856079 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1124 10:15:25.621201 1856079 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1124 10:15:25.621397 1856079 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1124 10:15:25.621530 1856079 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1124 10:15:27.406896 1856079 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1124 10:15:27.410332 1856079 out.go:252]   - Generating certificates and keys ...
	I1124 10:15:27.410421 1856079 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1124 10:15:27.410529 1856079 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1124 10:15:27.410607 1856079 kubeadm.go:319] [certs] Using existing apiserver-kubelet-client certificate and key on disk
	I1124 10:15:27.410667 1856079 kubeadm.go:319] [certs] Using existing front-proxy-ca certificate authority
	I1124 10:15:27.410737 1856079 kubeadm.go:319] [certs] Using existing front-proxy-client certificate and key on disk
	I1124 10:15:27.410790 1856079 kubeadm.go:319] [certs] Using existing etcd/ca certificate authority
	I1124 10:15:27.410852 1856079 kubeadm.go:319] [certs] Using existing etcd/server certificate and key on disk
	I1124 10:15:27.410913 1856079 kubeadm.go:319] [certs] Using existing etcd/peer certificate and key on disk
	I1124 10:15:27.410987 1856079 kubeadm.go:319] [certs] Using existing etcd/healthcheck-client certificate and key on disk
	I1124 10:15:27.411059 1856079 kubeadm.go:319] [certs] Using existing apiserver-etcd-client certificate and key on disk
	I1124 10:15:27.411097 1856079 kubeadm.go:319] [certs] Using the existing "sa" key
	I1124 10:15:27.411153 1856079 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1124 10:15:27.917725 1856079 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1124 10:15:28.275709 1856079 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1124 10:15:28.468309 1856079 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1124 10:15:28.628145 1856079 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1124 10:15:28.882772 1856079 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1124 10:15:28.883612 1856079 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1124 10:15:28.886344 1856079 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1124 10:15:28.889593 1856079 out.go:252]   - Booting up control plane ...
	I1124 10:15:28.889705 1856079 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1124 10:15:28.889925 1856079 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1124 10:15:28.890301 1856079 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1124 10:15:28.918332 1856079 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1124 10:15:28.918468 1856079 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1124 10:15:28.928790 1856079 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1124 10:15:28.928891 1856079 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1124 10:15:28.928931 1856079 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1124 10:15:29.117574 1856079 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1124 10:15:29.117695 1856079 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1124 10:19:29.117977 1856079 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.001199788s
	I1124 10:19:29.118017 1856079 kubeadm.go:319] 
	I1124 10:19:29.118075 1856079 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1124 10:19:29.118113 1856079 kubeadm.go:319] 	- The kubelet is not running
	I1124 10:19:29.118222 1856079 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1124 10:19:29.118232 1856079 kubeadm.go:319] 
	I1124 10:19:29.118336 1856079 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1124 10:19:29.118372 1856079 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1124 10:19:29.118406 1856079 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1124 10:19:29.118410 1856079 kubeadm.go:319] 
	I1124 10:19:29.122111 1856079 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1124 10:19:29.122554 1856079 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1124 10:19:29.122665 1856079 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1124 10:19:29.122898 1856079 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	I1124 10:19:29.122904 1856079 kubeadm.go:319] 
	I1124 10:19:29.122972 1856079 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	W1124 10:19:29.123079 1856079 out.go:285] ! initialization failed, will try again: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001199788s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	I1124 10:19:29.123154 1856079 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm reset --cri-socket /run/containerd/containerd.sock --force"
	I1124 10:19:29.541517 1856079 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1124 10:19:29.555488 1856079 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1124 10:19:29.555560 1856079 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1124 10:19:29.564144 1856079 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1124 10:19:29.564168 1856079 kubeadm.go:158] found existing configuration files:
	
	I1124 10:19:29.564224 1856079 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf
	I1124 10:19:29.571947 1856079 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1124 10:19:29.572013 1856079 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1124 10:19:29.579584 1856079 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf
	I1124 10:19:29.587620 1856079 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1124 10:19:29.587686 1856079 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1124 10:19:29.595031 1856079 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf
	I1124 10:19:29.602738 1856079 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1124 10:19:29.602800 1856079 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1124 10:19:29.610572 1856079 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf
	I1124 10:19:29.618585 1856079 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1124 10:19:29.618650 1856079 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1124 10:19:29.626054 1856079 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1124 10:19:29.667219 1856079 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-beta.0
	I1124 10:19:29.667343 1856079 kubeadm.go:319] [preflight] Running pre-flight checks
	I1124 10:19:29.745765 1856079 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1124 10:19:29.745838 1856079 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1124 10:19:29.745877 1856079 kubeadm.go:319] OS: Linux
	I1124 10:19:29.745922 1856079 kubeadm.go:319] CGROUPS_CPU: enabled
	I1124 10:19:29.745971 1856079 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1124 10:19:29.746018 1856079 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1124 10:19:29.746066 1856079 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1124 10:19:29.746114 1856079 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1124 10:19:29.746165 1856079 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1124 10:19:29.746210 1856079 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1124 10:19:29.746258 1856079 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1124 10:19:29.746305 1856079 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1124 10:19:29.811514 1856079 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1124 10:19:29.811706 1856079 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1124 10:19:29.811840 1856079 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1124 10:19:29.818729 1856079 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1124 10:19:29.824188 1856079 out.go:252]   - Generating certificates and keys ...
	I1124 10:19:29.824356 1856079 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1124 10:19:29.824471 1856079 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1124 10:19:29.824564 1856079 kubeadm.go:319] [certs] Using existing apiserver-kubelet-client certificate and key on disk
	I1124 10:19:29.824630 1856079 kubeadm.go:319] [certs] Using existing front-proxy-ca certificate authority
	I1124 10:19:29.824707 1856079 kubeadm.go:319] [certs] Using existing front-proxy-client certificate and key on disk
	I1124 10:19:29.824767 1856079 kubeadm.go:319] [certs] Using existing etcd/ca certificate authority
	I1124 10:19:29.824843 1856079 kubeadm.go:319] [certs] Using existing etcd/server certificate and key on disk
	I1124 10:19:29.824943 1856079 kubeadm.go:319] [certs] Using existing etcd/peer certificate and key on disk
	I1124 10:19:29.825025 1856079 kubeadm.go:319] [certs] Using existing etcd/healthcheck-client certificate and key on disk
	I1124 10:19:29.825098 1856079 kubeadm.go:319] [certs] Using existing apiserver-etcd-client certificate and key on disk
	I1124 10:19:29.825136 1856079 kubeadm.go:319] [certs] Using the existing "sa" key
	I1124 10:19:29.825192 1856079 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1124 10:19:30.013019 1856079 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1124 10:19:30.069024 1856079 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1124 10:19:30.193243 1856079 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1124 10:19:30.293430 1856079 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1124 10:19:30.446318 1856079 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1124 10:19:30.447147 1856079 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1124 10:19:30.452066 1856079 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1124 10:19:30.455516 1856079 out.go:252]   - Booting up control plane ...
	I1124 10:19:30.455698 1856079 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1124 10:19:30.455839 1856079 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1124 10:19:30.455958 1856079 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1124 10:19:30.476525 1856079 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1124 10:19:30.477111 1856079 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1124 10:19:30.486085 1856079 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1124 10:19:30.486429 1856079 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1124 10:19:30.486702 1856079 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1124 10:19:30.622930 1856079 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1124 10:19:30.623053 1856079 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1124 10:23:30.622032 1856079 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.001254795s
	I1124 10:23:30.622067 1856079 kubeadm.go:319] 
	I1124 10:23:30.622121 1856079 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1124 10:23:30.622152 1856079 kubeadm.go:319] 	- The kubelet is not running
	I1124 10:23:30.622251 1856079 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1124 10:23:30.622257 1856079 kubeadm.go:319] 
	I1124 10:23:30.622355 1856079 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1124 10:23:30.622385 1856079 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1124 10:23:30.622415 1856079 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1124 10:23:30.622419 1856079 kubeadm.go:319] 
	I1124 10:23:30.626605 1856079 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1124 10:23:30.627034 1856079 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1124 10:23:30.627143 1856079 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1124 10:23:30.627378 1856079 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	I1124 10:23:30.627383 1856079 kubeadm.go:319] 
	I1124 10:23:30.627461 1856079 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	I1124 10:23:30.627514 1856079 kubeadm.go:403] duration metric: took 12m8.973699951s to StartCluster
	I1124 10:23:30.627561 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 10:23:30.627621 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 10:23:30.664923 1856079 cri.go:89] found id: ""
	I1124 10:23:30.664944 1856079 logs.go:282] 0 containers: []
	W1124 10:23:30.664953 1856079 logs.go:284] No container was found matching "kube-apiserver"
	I1124 10:23:30.664959 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 10:23:30.665018 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 10:23:30.701134 1856079 cri.go:89] found id: ""
	I1124 10:23:30.701156 1856079 logs.go:282] 0 containers: []
	W1124 10:23:30.701164 1856079 logs.go:284] No container was found matching "etcd"
	I1124 10:23:30.701171 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 10:23:30.701233 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 10:23:30.735694 1856079 cri.go:89] found id: ""
	I1124 10:23:30.735716 1856079 logs.go:282] 0 containers: []
	W1124 10:23:30.735725 1856079 logs.go:284] No container was found matching "coredns"
	I1124 10:23:30.735731 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 10:23:30.735789 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 10:23:30.767541 1856079 cri.go:89] found id: ""
	I1124 10:23:30.767562 1856079 logs.go:282] 0 containers: []
	W1124 10:23:30.767571 1856079 logs.go:284] No container was found matching "kube-scheduler"
	I1124 10:23:30.767582 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 10:23:30.767639 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 10:23:30.797084 1856079 cri.go:89] found id: ""
	I1124 10:23:30.797106 1856079 logs.go:282] 0 containers: []
	W1124 10:23:30.797114 1856079 logs.go:284] No container was found matching "kube-proxy"
	I1124 10:23:30.797121 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 10:23:30.797178 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 10:23:30.833027 1856079 cri.go:89] found id: ""
	I1124 10:23:30.833099 1856079 logs.go:282] 0 containers: []
	W1124 10:23:30.833170 1856079 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 10:23:30.833195 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 10:23:30.833288 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 10:23:30.867701 1856079 cri.go:89] found id: ""
	I1124 10:23:30.867774 1856079 logs.go:282] 0 containers: []
	W1124 10:23:30.867798 1856079 logs.go:284] No container was found matching "kindnet"
	I1124 10:23:30.867819 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1124 10:23:30.867911 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1124 10:23:30.903593 1856079 cri.go:89] found id: ""
	I1124 10:23:30.903665 1856079 logs.go:282] 0 containers: []
	W1124 10:23:30.903688 1856079 logs.go:284] No container was found matching "storage-provisioner"
	I1124 10:23:30.903713 1856079 logs.go:123] Gathering logs for kubelet ...
	I1124 10:23:30.903758 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 10:23:30.982104 1856079 logs.go:123] Gathering logs for dmesg ...
	I1124 10:23:30.982186 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 10:23:31.000228 1856079 logs.go:123] Gathering logs for describe nodes ...
	I1124 10:23:31.000255 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 10:23:31.092172 1856079 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 10:23:31.092203 1856079 logs.go:123] Gathering logs for containerd ...
	I1124 10:23:31.092216 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 10:23:31.144366 1856079 logs.go:123] Gathering logs for container status ...
	I1124 10:23:31.144410 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	W1124 10:23:31.217122 1856079 out.go:434] Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001254795s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	W1124 10:23:31.217172 1856079 out.go:285] * 
	W1124 10:23:31.217227 1856079 out.go:285] X Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001254795s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	W1124 10:23:31.217245 1856079 out.go:285] * 
	W1124 10:23:31.219547 1856079 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1124 10:23:31.229855 1856079 out.go:203] 
	W1124 10:23:31.233484 1856079 out.go:285] X Exiting due to K8S_KUBELET_NOT_RUNNING: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001254795s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	W1124 10:23:31.233540 1856079 out.go:285] * Suggestion: Check output of 'journalctl -xeu kubelet', try passing --extra-config=kubelet.cgroup-driver=systemd to minikube start
	W1124 10:23:31.233566 1856079 out.go:285] * Related issue: https://github.com/kubernetes/minikube/issues/4172
	I1124 10:23:31.237388 1856079 out.go:203] 

** /stderr **
version_upgrade_test.go:245: failed to upgrade with newest k8s version. args: out/minikube-linux-arm64 start -p kubernetes-upgrade-188777 --memory=3072 --kubernetes-version=v1.35.0-beta.0 --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd : exit status 109
version_upgrade_test.go:248: (dbg) Run:  kubectl --context kubernetes-upgrade-188777 version --output=json
version_upgrade_test.go:248: (dbg) Non-zero exit: kubectl --context kubernetes-upgrade-188777 version --output=json: exit status 1 (141.459357ms)

-- stdout --
	{
	  "clientVersion": {
	    "major": "1",
	    "minor": "33",
	    "gitVersion": "v1.33.2",
	    "gitCommit": "a57b6f7709f6c2722b92f07b8b4c48210a51fc40",
	    "gitTreeState": "clean",
	    "buildDate": "2025-06-17T18:41:31Z",
	    "goVersion": "go1.24.4",
	    "compiler": "gc",
	    "platform": "linux/arm64"
	  },
	  "kustomizeVersion": "v5.6.0"
	}

-- /stdout --
** stderr ** 
	The connection to the server 192.168.76.2:8443 was refused - did you specify the right host or port?

** /stderr **
version_upgrade_test.go:250: error running kubectl: exit status 1
panic.go:615: *** TestKubernetesUpgrade FAILED at 2025-11-24 10:23:32.410667571 +0000 UTC m=+6027.777974117
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:223: ======>  post-mortem[TestKubernetesUpgrade]: network settings <======
helpers_test.go:230: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:238: ======>  post-mortem[TestKubernetesUpgrade]: docker inspect <======
helpers_test.go:239: (dbg) Run:  docker inspect kubernetes-upgrade-188777
helpers_test.go:243: (dbg) docker inspect kubernetes-upgrade-188777:

-- stdout --
	[
	    {
	        "Id": "5c25da7c8f3b43c276b8e67dfeb473da6c28e70813c0fed65664127113c28b3a",
	        "Created": "2025-11-24T10:10:22.34499481Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 1856413,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-11-24T10:10:59.387357079Z",
	            "FinishedAt": "2025-11-24T10:10:58.112580733Z"
	        },
	        "Image": "sha256:572c983e466f1f784136812eef5cc59ac623db764bc7704d3676c4643993fd08",
	        "ResolvConfPath": "/var/lib/docker/containers/5c25da7c8f3b43c276b8e67dfeb473da6c28e70813c0fed65664127113c28b3a/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/5c25da7c8f3b43c276b8e67dfeb473da6c28e70813c0fed65664127113c28b3a/hostname",
	        "HostsPath": "/var/lib/docker/containers/5c25da7c8f3b43c276b8e67dfeb473da6c28e70813c0fed65664127113c28b3a/hosts",
	        "LogPath": "/var/lib/docker/containers/5c25da7c8f3b43c276b8e67dfeb473da6c28e70813c0fed65664127113c28b3a/5c25da7c8f3b43c276b8e67dfeb473da6c28e70813c0fed65664127113c28b3a-json.log",
	        "Name": "/kubernetes-upgrade-188777",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "/lib/modules:/lib/modules:ro",
	                "kubernetes-upgrade-188777:/var"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "kubernetes-upgrade-188777",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 3221225472,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 6442450944,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "5c25da7c8f3b43c276b8e67dfeb473da6c28e70813c0fed65664127113c28b3a",
	                "LowerDir": "/var/lib/docker/overlay2/03de757a6b58240f00cc9f10fa2a36c19fa646cc078dafaa2617bcbb2115b293-init/diff:/var/lib/docker/overlay2/bf294786c7ade59cb761c7bdcc27045362c3779b00a569b685443f34ade17737/diff",
	                "MergedDir": "/var/lib/docker/overlay2/03de757a6b58240f00cc9f10fa2a36c19fa646cc078dafaa2617bcbb2115b293/merged",
	                "UpperDir": "/var/lib/docker/overlay2/03de757a6b58240f00cc9f10fa2a36c19fa646cc078dafaa2617bcbb2115b293/diff",
	                "WorkDir": "/var/lib/docker/overlay2/03de757a6b58240f00cc9f10fa2a36c19fa646cc078dafaa2617bcbb2115b293/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "volume",
	                "Name": "kubernetes-upgrade-188777",
	                "Source": "/var/lib/docker/volumes/kubernetes-upgrade-188777/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            },
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            }
	        ],
	        "Config": {
	            "Hostname": "kubernetes-upgrade-188777",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8443/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "kubernetes-upgrade-188777",
	                "name.minikube.sigs.k8s.io": "kubernetes-upgrade-188777",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "26f337bcbf6377958d2e0a66c2d73eff130f70cb7cb4f4ad3d717e890afde9b4",
	            "SandboxKey": "/var/run/docker/netns/26f337bcbf63",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34914"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34915"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34918"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34916"
	                    }
	                ],
	                "8443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34917"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "kubernetes-upgrade-188777": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.76.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "da:e8:fc:d0:f5:e7",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "31e3e0575d24a908b7343adfd3c1163001c535a3c44e4655fd81da88160e6d2a",
	                    "EndpointID": "c7873bdc6c673b40b277dc64dcd26dad588ed00f68486296039fe970cb52eb5a",
	                    "Gateway": "192.168.76.1",
	                    "IPAddress": "192.168.76.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "kubernetes-upgrade-188777",
	                        "5c25da7c8f3b"
	                    ]
	                }
	            }
	        }
	    }
	]

-- /stdout --
helpers_test.go:247: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p kubernetes-upgrade-188777 -n kubernetes-upgrade-188777
helpers_test.go:247: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p kubernetes-upgrade-188777 -n kubernetes-upgrade-188777: exit status 2 (390.509292ms)

-- stdout --
	Running

-- /stdout --
helpers_test.go:247: status error: exit status 2 (may be ok)
helpers_test.go:252: <<< TestKubernetesUpgrade FAILED: start of post-mortem logs <<<
helpers_test.go:253: ======>  post-mortem[TestKubernetesUpgrade]: minikube logs <======
helpers_test.go:255: (dbg) Run:  out/minikube-linux-arm64 -p kubernetes-upgrade-188777 logs -n 25
helpers_test.go:255: (dbg) Done: out/minikube-linux-arm64 -p kubernetes-upgrade-188777 logs -n 25: (1.012752061s)
helpers_test.go:260: TestKubernetesUpgrade logs: 
-- stdout --
	
	==> Audit <==
	┌─────────┬─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬──────────────────────────┬─────────┬─────────┬─────────────────────┬────────────
─────────┐
	│ COMMAND │                                                                                                                        ARGS                                                                                                                         │         PROFILE          │  USER   │ VERSION │     START TIME      │      END TIME       │
	├─────────┼─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼──────────────────────────┼─────────┼─────────┼─────────────────────┼────────────
─────────┤
	│ ssh     │ -p cilium-713538 sudo systemctl cat crio --no-pager                                                                                                                                                                                                 │ cilium-713538            │ jenkins │ v1.37.0 │ 24 Nov 25 10:15 UTC │                     │
	│ ssh     │ -p cilium-713538 sudo find /etc/crio -type f -exec sh -c 'echo {}; cat {}' \;                                                                                                                                                                       │ cilium-713538            │ jenkins │ v1.37.0 │ 24 Nov 25 10:15 UTC │                     │
	│ ssh     │ -p cilium-713538 sudo crio config                                                                                                                                                                                                                   │ cilium-713538            │ jenkins │ v1.37.0 │ 24 Nov 25 10:15 UTC │                     │
	│ delete  │ -p cilium-713538                                                                                                                                                                                                                                    │ cilium-713538            │ jenkins │ v1.37.0 │ 24 Nov 25 10:15 UTC │ 24 Nov 25 10:15 UTC │
	│ start   │ -p force-systemd-env-924581 --memory=3072 --alsologtostderr -v=5 --driver=docker  --container-runtime=containerd                                                                                                                                    │ force-systemd-env-924581 │ jenkins │ v1.37.0 │ 24 Nov 25 10:15 UTC │ 24 Nov 25 10:16 UTC │
	│ ssh     │ force-systemd-env-924581 ssh cat /etc/containerd/config.toml                                                                                                                                                                                        │ force-systemd-env-924581 │ jenkins │ v1.37.0 │ 24 Nov 25 10:16 UTC │ 24 Nov 25 10:16 UTC │
	│ delete  │ -p force-systemd-env-924581                                                                                                                                                                                                                         │ force-systemd-env-924581 │ jenkins │ v1.37.0 │ 24 Nov 25 10:16 UTC │ 24 Nov 25 10:16 UTC │
	│ start   │ -p cert-expiration-991536 --memory=3072 --cert-expiration=3m --driver=docker  --container-runtime=containerd                                                                                                                                        │ cert-expiration-991536   │ jenkins │ v1.37.0 │ 24 Nov 25 10:16 UTC │ 24 Nov 25 10:16 UTC │
	│ start   │ -p cert-expiration-991536 --memory=3072 --cert-expiration=8760h --driver=docker  --container-runtime=containerd                                                                                                                                     │ cert-expiration-991536   │ jenkins │ v1.37.0 │ 24 Nov 25 10:19 UTC │ 24 Nov 25 10:19 UTC │
	│ delete  │ -p cert-expiration-991536                                                                                                                                                                                                                           │ cert-expiration-991536   │ jenkins │ v1.37.0 │ 24 Nov 25 10:19 UTC │ 24 Nov 25 10:19 UTC │
	│ start   │ -p cert-options-477984 --memory=3072 --apiserver-ips=127.0.0.1 --apiserver-ips=192.168.15.15 --apiserver-names=localhost --apiserver-names=www.google.com --apiserver-port=8555 --driver=docker  --container-runtime=containerd                     │ cert-options-477984      │ jenkins │ v1.37.0 │ 24 Nov 25 10:19 UTC │ 24 Nov 25 10:20 UTC │
	│ ssh     │ cert-options-477984 ssh openssl x509 -text -noout -in /var/lib/minikube/certs/apiserver.crt                                                                                                                                                         │ cert-options-477984      │ jenkins │ v1.37.0 │ 24 Nov 25 10:20 UTC │ 24 Nov 25 10:20 UTC │
	│ ssh     │ -p cert-options-477984 -- sudo cat /etc/kubernetes/admin.conf                                                                                                                                                                                       │ cert-options-477984      │ jenkins │ v1.37.0 │ 24 Nov 25 10:20 UTC │ 24 Nov 25 10:20 UTC │
	│ delete  │ -p cert-options-477984                                                                                                                                                                                                                              │ cert-options-477984      │ jenkins │ v1.37.0 │ 24 Nov 25 10:20 UTC │ 24 Nov 25 10:20 UTC │
	│ start   │ -p old-k8s-version-662905 --memory=3072 --alsologtostderr --wait=true --kvm-network=default --kvm-qemu-uri=qemu:///system --disable-driver-mounts --keep-context=false --driver=docker  --container-runtime=containerd --kubernetes-version=v1.28.0 │ old-k8s-version-662905   │ jenkins │ v1.37.0 │ 24 Nov 25 10:20 UTC │ 24 Nov 25 10:21 UTC │
	│ addons  │ enable metrics-server -p old-k8s-version-662905 --images=MetricsServer=registry.k8s.io/echoserver:1.4 --registries=MetricsServer=fake.domain                                                                                                        │ old-k8s-version-662905   │ jenkins │ v1.37.0 │ 24 Nov 25 10:21 UTC │ 24 Nov 25 10:21 UTC │
	│ stop    │ -p old-k8s-version-662905 --alsologtostderr -v=3                                                                                                                                                                                                    │ old-k8s-version-662905   │ jenkins │ v1.37.0 │ 24 Nov 25 10:21 UTC │ 24 Nov 25 10:21 UTC │
	│ addons  │ enable dashboard -p old-k8s-version-662905 --images=MetricsScraper=registry.k8s.io/echoserver:1.4                                                                                                                                                   │ old-k8s-version-662905   │ jenkins │ v1.37.0 │ 24 Nov 25 10:21 UTC │ 24 Nov 25 10:21 UTC │
	│ start   │ -p old-k8s-version-662905 --memory=3072 --alsologtostderr --wait=true --kvm-network=default --kvm-qemu-uri=qemu:///system --disable-driver-mounts --keep-context=false --driver=docker  --container-runtime=containerd --kubernetes-version=v1.28.0 │ old-k8s-version-662905   │ jenkins │ v1.37.0 │ 24 Nov 25 10:21 UTC │ 24 Nov 25 10:22 UTC │
	│ image   │ old-k8s-version-662905 image list --format=json                                                                                                                                                                                                     │ old-k8s-version-662905   │ jenkins │ v1.37.0 │ 24 Nov 25 10:23 UTC │ 24 Nov 25 10:23 UTC │
	│ pause   │ -p old-k8s-version-662905 --alsologtostderr -v=1                                                                                                                                                                                                    │ old-k8s-version-662905   │ jenkins │ v1.37.0 │ 24 Nov 25 10:23 UTC │ 24 Nov 25 10:23 UTC │
	│ unpause │ -p old-k8s-version-662905 --alsologtostderr -v=1                                                                                                                                                                                                    │ old-k8s-version-662905   │ jenkins │ v1.37.0 │ 24 Nov 25 10:23 UTC │ 24 Nov 25 10:23 UTC │
	│ delete  │ -p old-k8s-version-662905                                                                                                                                                                                                                           │ old-k8s-version-662905   │ jenkins │ v1.37.0 │ 24 Nov 25 10:23 UTC │ 24 Nov 25 10:23 UTC │
	│ delete  │ -p old-k8s-version-662905                                                                                                                                                                                                                           │ old-k8s-version-662905   │ jenkins │ v1.37.0 │ 24 Nov 25 10:23 UTC │ 24 Nov 25 10:23 UTC │
	│ start   │ -p no-preload-164104 --memory=3072 --alsologtostderr --wait=true --preload=false --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-beta.0                                                                                │ no-preload-164104        │ jenkins │ v1.37.0 │ 24 Nov 25 10:23 UTC │                     │
	└─────────┴─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴──────────────────────────┴─────────┴─────────┴─────────────────────┴────────────
─────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/11/24 10:23:06
	Running on machine: ip-172-31-30-239
	Binary: Built with gc go1.25.3 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1124 10:23:06.785077 1904000 out.go:360] Setting OutFile to fd 1 ...
	I1124 10:23:06.785203 1904000 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1124 10:23:06.785215 1904000 out.go:374] Setting ErrFile to fd 2...
	I1124 10:23:06.785219 1904000 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1124 10:23:06.785511 1904000 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21978-1652607/.minikube/bin
	I1124 10:23:06.785923 1904000 out.go:368] Setting JSON to false
	I1124 10:23:06.786947 1904000 start.go:133] hostinfo: {"hostname":"ip-172-31-30-239","uptime":32716,"bootTime":1763947071,"procs":179,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"92f46a7d-c249-4c12-924a-77f64874c910"}
	I1124 10:23:06.787024 1904000 start.go:143] virtualization:  
	I1124 10:23:06.791217 1904000 out.go:179] * [no-preload-164104] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1124 10:23:06.794569 1904000 out.go:179]   - MINIKUBE_LOCATION=21978
	I1124 10:23:06.794735 1904000 notify.go:221] Checking for updates...
	I1124 10:23:06.800983 1904000 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1124 10:23:06.804140 1904000 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/21978-1652607/kubeconfig
	I1124 10:23:06.807273 1904000 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/21978-1652607/.minikube
	I1124 10:23:06.810546 1904000 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1124 10:23:06.813575 1904000 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1124 10:23:06.817273 1904000 config.go:182] Loaded profile config "kubernetes-upgrade-188777": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1124 10:23:06.817460 1904000 driver.go:422] Setting default libvirt URI to qemu:///system
	I1124 10:23:06.844112 1904000 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1124 10:23:06.844255 1904000 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1124 10:23:06.902829 1904000 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:4 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:37 OomKillDisable:true NGoroutines:52 SystemTime:2025-11-24 10:23:06.893525152 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:a
arch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214831104 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx P
ath:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1124 10:23:06.902935 1904000 docker.go:319] overlay module found
	I1124 10:23:06.906081 1904000 out.go:179] * Using the docker driver based on user configuration
	I1124 10:23:06.908999 1904000 start.go:309] selected driver: docker
	I1124 10:23:06.909027 1904000 start.go:927] validating driver "docker" against <nil>
	I1124 10:23:06.909048 1904000 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1124 10:23:06.909764 1904000 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1124 10:23:06.972015 1904000 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:4 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:37 OomKillDisable:true NGoroutines:52 SystemTime:2025-11-24 10:23:06.962065507 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:a
arch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214831104 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx P
ath:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1124 10:23:06.972174 1904000 start_flags.go:327] no existing cluster config was found, will generate one from the flags 
	I1124 10:23:06.972429 1904000 start_flags.go:992] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I1124 10:23:06.975545 1904000 out.go:179] * Using Docker driver with root privileges
	I1124 10:23:06.978432 1904000 cni.go:84] Creating CNI manager for ""
	I1124 10:23:06.978590 1904000 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1124 10:23:06.978601 1904000 start_flags.go:336] Found "CNI" CNI - setting NetworkPlugin=cni
	I1124 10:23:06.978689 1904000 start.go:353] cluster config:
	{Name:no-preload-164104 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:3072 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:no-preload-164104 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local C
ontainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP
: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1124 10:23:06.981795 1904000 out.go:179] * Starting "no-preload-164104" primary control-plane node in "no-preload-164104" cluster
	I1124 10:23:06.984777 1904000 cache.go:134] Beginning downloading kic base image for docker with containerd
	I1124 10:23:06.987741 1904000 out.go:179] * Pulling base image v0.0.48-1763789673-21948 ...
	I1124 10:23:06.990563 1904000 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
	I1124 10:23:06.990643 1904000 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f in local docker daemon
	I1124 10:23:06.990693 1904000 profile.go:143] Saving config to /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/no-preload-164104/config.json ...
	I1124 10:23:06.990767 1904000 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/no-preload-164104/config.json: {Name:mk4a47dc6315619a6d7e1c0c16083012e7a4949d Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1124 10:23:06.991820 1904000 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 10:23:07.031714 1904000 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f in local docker daemon, skipping pull
	I1124 10:23:07.031755 1904000 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f exists in daemon, skipping load
	I1124 10:23:07.031776 1904000 cache.go:243] Successfully downloaded all kic artifacts
	I1124 10:23:07.031811 1904000 start.go:360] acquireMachinesLock for no-preload-164104: {Name:mk7cfd70748c2da1ad2f77bca120090c8d096e16 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 10:23:07.031945 1904000 start.go:364] duration metric: took 110.024µs to acquireMachinesLock for "no-preload-164104"
	I1124 10:23:07.031973 1904000 start.go:93] Provisioning new machine with config: &{Name:no-preload-164104 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:3072 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:no-preload-164104 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name: IP: Port:8443 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I1124 10:23:07.032073 1904000 start.go:125] createHost starting for "" (driver="docker")
	I1124 10:23:07.035634 1904000 out.go:252] * Creating docker container (CPUs=2, Memory=3072MB) ...
	I1124 10:23:07.035990 1904000 start.go:159] libmachine.API.Create for "no-preload-164104" (driver="docker")
	I1124 10:23:07.036042 1904000 client.go:173] LocalClient.Create starting
	I1124 10:23:07.036137 1904000 main.go:143] libmachine: Reading certificate data from /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem
	I1124 10:23:07.036180 1904000 main.go:143] libmachine: Decoding PEM data...
	I1124 10:23:07.036206 1904000 main.go:143] libmachine: Parsing certificate...
	I1124 10:23:07.036268 1904000 main.go:143] libmachine: Reading certificate data from /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/cert.pem
	I1124 10:23:07.036291 1904000 main.go:143] libmachine: Decoding PEM data...
	I1124 10:23:07.036310 1904000 main.go:143] libmachine: Parsing certificate...
	I1124 10:23:07.036805 1904000 cli_runner.go:164] Run: docker network inspect no-preload-164104 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	W1124 10:23:07.054874 1904000 cli_runner.go:211] docker network inspect no-preload-164104 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}" returned with exit code 1
	I1124 10:23:07.054975 1904000 network_create.go:284] running [docker network inspect no-preload-164104] to gather additional debugging logs...
	I1124 10:23:07.054996 1904000 cli_runner.go:164] Run: docker network inspect no-preload-164104
	W1124 10:23:07.088140 1904000 cli_runner.go:211] docker network inspect no-preload-164104 returned with exit code 1
	I1124 10:23:07.088169 1904000 network_create.go:287] error running [docker network inspect no-preload-164104]: docker network inspect no-preload-164104: exit status 1
	stdout:
	[]
	
	stderr:
	Error response from daemon: network no-preload-164104 not found
	I1124 10:23:07.088184 1904000 network_create.go:289] output of [docker network inspect no-preload-164104]: -- stdout --
	[]
	
	-- /stdout --
	** stderr ** 
	Error response from daemon: network no-preload-164104 not found
	
	** /stderr **
	I1124 10:23:07.088292 1904000 cli_runner.go:164] Run: docker network inspect bridge --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1124 10:23:07.105821 1904000 network.go:211] skipping subnet 192.168.49.0/24 that is taken: &{IP:192.168.49.0 Netmask:255.255.255.0 Prefix:24 CIDR:192.168.49.0/24 Gateway:192.168.49.1 ClientMin:192.168.49.2 ClientMax:192.168.49.254 Broadcast:192.168.49.255 IsPrivate:true Interface:{IfaceName:br-150d0a6dddcd IfaceIPv4:192.168.49.1 IfaceMTU:1500 IfaceMAC:e2:5f:fe:86:6f:c9} reservation:<nil>}
	I1124 10:23:07.106090 1904000 network.go:211] skipping subnet 192.168.58.0/24 that is taken: &{IP:192.168.58.0 Netmask:255.255.255.0 Prefix:24 CIDR:192.168.58.0/24 Gateway:192.168.58.1 ClientMin:192.168.58.2 ClientMax:192.168.58.254 Broadcast:192.168.58.255 IsPrivate:true Interface:{IfaceName:br-f63b6a27aede IfaceIPv4:192.168.58.1 IfaceMTU:1500 IfaceMAC:22:51:8a:90:6c:2e} reservation:<nil>}
	I1124 10:23:07.106375 1904000 network.go:211] skipping subnet 192.168.67.0/24 that is taken: &{IP:192.168.67.0 Netmask:255.255.255.0 Prefix:24 CIDR:192.168.67.0/24 Gateway:192.168.67.1 ClientMin:192.168.67.2 ClientMax:192.168.67.254 Broadcast:192.168.67.255 IsPrivate:true Interface:{IfaceName:br-ce60bf667501 IfaceIPv4:192.168.67.1 IfaceMTU:1500 IfaceMAC:f6:ce:76:94:d1:83} reservation:<nil>}
	I1124 10:23:07.106735 1904000 network.go:211] skipping subnet 192.168.76.0/24 that is taken: &{IP:192.168.76.0 Netmask:255.255.255.0 Prefix:24 CIDR:192.168.76.0/24 Gateway:192.168.76.1 ClientMin:192.168.76.2 ClientMax:192.168.76.254 Broadcast:192.168.76.255 IsPrivate:true Interface:{IfaceName:br-31e3e0575d24 IfaceIPv4:192.168.76.1 IfaceMTU:1500 IfaceMAC:be:56:e6:fe:ca:a9} reservation:<nil>}
	I1124 10:23:07.107165 1904000 network.go:206] using free private subnet 192.168.85.0/24: &{IP:192.168.85.0 Netmask:255.255.255.0 Prefix:24 CIDR:192.168.85.0/24 Gateway:192.168.85.1 ClientMin:192.168.85.2 ClientMax:192.168.85.254 Broadcast:192.168.85.255 IsPrivate:true Interface:{IfaceName: IfaceIPv4: IfaceMTU:0 IfaceMAC:} reservation:0x400197fd50}
	I1124 10:23:07.107183 1904000 network_create.go:124] attempt to create docker network no-preload-164104 192.168.85.0/24 with gateway 192.168.85.1 and MTU of 1500 ...
	I1124 10:23:07.107248 1904000 cli_runner.go:164] Run: docker network create --driver=bridge --subnet=192.168.85.0/24 --gateway=192.168.85.1 -o --ip-masq -o --icc -o com.docker.network.driver.mtu=1500 --label=created_by.minikube.sigs.k8s.io=true --label=name.minikube.sigs.k8s.io=no-preload-164104 no-preload-164104
	I1124 10:23:07.186011 1904000 network_create.go:108] docker network no-preload-164104 192.168.85.0/24 created
	I1124 10:23:07.186040 1904000 kic.go:121] calculated static IP "192.168.85.2" for the "no-preload-164104" container
	I1124 10:23:07.186132 1904000 cli_runner.go:164] Run: docker ps -a --format {{.Names}}
	I1124 10:23:07.211129 1904000 cli_runner.go:164] Run: docker volume create no-preload-164104 --label name.minikube.sigs.k8s.io=no-preload-164104 --label created_by.minikube.sigs.k8s.io=true
	I1124 10:23:07.225918 1904000 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 10:23:07.229996 1904000 oci.go:103] Successfully created a docker volume no-preload-164104
	I1124 10:23:07.230104 1904000 cli_runner.go:164] Run: docker run --rm --name no-preload-164104-preload-sidecar --label created_by.minikube.sigs.k8s.io=true --label name.minikube.sigs.k8s.io=no-preload-164104 --entrypoint /usr/bin/test -v no-preload-164104:/var gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f -d /var/lib
	I1124 10:23:07.395977 1904000 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 10:23:07.567777 1904000 cache.go:107] acquiring lock: {Name:mk22a10f0ce1f3295b61e7e76c455d0494a3e278 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 10:23:07.567876 1904000 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 exists
	I1124 10:23:07.567885 1904000 cache.go:96] cache image "gcr.io/k8s-minikube/storage-provisioner:v5" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5" took 124.136µs
	I1124 10:23:07.567894 1904000 cache.go:80] save to tar file gcr.io/k8s-minikube/storage-provisioner:v5 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 succeeded
	I1124 10:23:07.567906 1904000 cache.go:107] acquiring lock: {Name:mk1cf42e67442503a46c578224bd3cb68bf682d4 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 10:23:07.567936 1904000 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 exists
	I1124 10:23:07.567941 1904000 cache.go:96] cache image "registry.k8s.io/kube-apiserver:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0" took 37.145µs
	I1124 10:23:07.567951 1904000 cache.go:80] save to tar file registry.k8s.io/kube-apiserver:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 succeeded
	I1124 10:23:07.567961 1904000 cache.go:107] acquiring lock: {Name:mkfdc49c8e68aee34cee0c9d441ae8a4dca675c6 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 10:23:07.567988 1904000 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 exists
	I1124 10:23:07.567993 1904000 cache.go:96] cache image "registry.k8s.io/kube-controller-manager:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0" took 33.28µs
	I1124 10:23:07.567999 1904000 cache.go:80] save to tar file registry.k8s.io/kube-controller-manager:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 succeeded
	I1124 10:23:07.568025 1904000 cache.go:107] acquiring lock: {Name:mkdbf38e05e2c47c1a7a906a2236e9e7020a94c5 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 10:23:07.568053 1904000 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 exists
	I1124 10:23:07.568063 1904000 cache.go:96] cache image "registry.k8s.io/kube-scheduler:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0" took 34.429µs
	I1124 10:23:07.568069 1904000 cache.go:80] save to tar file registry.k8s.io/kube-scheduler:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 succeeded
	I1124 10:23:07.568078 1904000 cache.go:107] acquiring lock: {Name:mk80fdbe7cdb5bc17c2a82b4ecfd00214559a435 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 10:23:07.568104 1904000 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 exists
	I1124 10:23:07.568108 1904000 cache.go:96] cache image "registry.k8s.io/kube-proxy:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0" took 31.459µs
	I1124 10:23:07.568113 1904000 cache.go:80] save to tar file registry.k8s.io/kube-proxy:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 succeeded
	I1124 10:23:07.568124 1904000 cache.go:107] acquiring lock: {Name:mk85f1502dbb97830776608fb729eb3605e112e6 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 10:23:07.568152 1904000 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 exists
	I1124 10:23:07.568157 1904000 cache.go:96] cache image "registry.k8s.io/pause:3.10.1" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1" took 34.421µs
	I1124 10:23:07.568167 1904000 cache.go:80] save to tar file registry.k8s.io/pause:3.10.1 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 succeeded
	I1124 10:23:07.568176 1904000 cache.go:107] acquiring lock: {Name:mk46ce3b59d7e062b3dbc8a90fe5b4231f256471 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 10:23:07.568200 1904000 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.5.24-0 exists
	I1124 10:23:07.568205 1904000 cache.go:96] cache image "registry.k8s.io/etcd:3.5.24-0" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.5.24-0" took 30.137µs
	I1124 10:23:07.568212 1904000 cache.go:80] save to tar file registry.k8s.io/etcd:3.5.24-0 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.5.24-0 succeeded
	I1124 10:23:07.568220 1904000 cache.go:107] acquiring lock: {Name:mk726502cb84c177b2e14fee88512325761511c5 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1124 10:23:07.568244 1904000 cache.go:115] /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 exists
	I1124 10:23:07.568249 1904000 cache.go:96] cache image "registry.k8s.io/coredns/coredns:v1.13.1" -> "/home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1" took 29.473µs
	I1124 10:23:07.568255 1904000 cache.go:80] save to tar file registry.k8s.io/coredns/coredns:v1.13.1 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 succeeded
	I1124 10:23:07.568261 1904000 cache.go:87] Successfully saved all images to host disk.
	I1124 10:23:07.796850 1904000 oci.go:107] Successfully prepared a docker volume no-preload-164104
	I1124 10:23:07.796912 1904000 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
	W1124 10:23:07.797041 1904000 cgroups_linux.go:77] Your kernel does not support swap limit capabilities or the cgroup is not mounted.
	I1124 10:23:07.797177 1904000 cli_runner.go:164] Run: docker info --format "'{{json .SecurityOptions}}'"
	I1124 10:23:07.850912 1904000 cli_runner.go:164] Run: docker run -d -t --privileged --security-opt seccomp=unconfined --tmpfs /tmp --tmpfs /run -v /lib/modules:/lib/modules:ro --hostname no-preload-164104 --name no-preload-164104 --label created_by.minikube.sigs.k8s.io=true --label name.minikube.sigs.k8s.io=no-preload-164104 --label role.minikube.sigs.k8s.io= --label mode.minikube.sigs.k8s.io=no-preload-164104 --network no-preload-164104 --ip 192.168.85.2 --volume no-preload-164104:/var --security-opt apparmor=unconfined --memory=3072mb --cpus=2 -e container=docker --expose 8443 --publish=127.0.0.1::8443 --publish=127.0.0.1::22 --publish=127.0.0.1::2376 --publish=127.0.0.1::5000 --publish=127.0.0.1::32443 gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f
	I1124 10:23:08.172913 1904000 cli_runner.go:164] Run: docker container inspect no-preload-164104 --format={{.State.Running}}
	I1124 10:23:08.195240 1904000 cli_runner.go:164] Run: docker container inspect no-preload-164104 --format={{.State.Status}}
	I1124 10:23:08.219263 1904000 cli_runner.go:164] Run: docker exec no-preload-164104 stat /var/lib/dpkg/alternatives/iptables
	I1124 10:23:08.274152 1904000 oci.go:144] the created container "no-preload-164104" has a running status.
	I1124 10:23:08.274180 1904000 kic.go:225] Creating ssh key for kic: /home/jenkins/minikube-integration/21978-1652607/.minikube/machines/no-preload-164104/id_rsa...
	I1124 10:23:08.760965 1904000 kic_runner.go:191] docker (temp): /home/jenkins/minikube-integration/21978-1652607/.minikube/machines/no-preload-164104/id_rsa.pub --> /home/docker/.ssh/authorized_keys (381 bytes)
	I1124 10:23:08.780899 1904000 cli_runner.go:164] Run: docker container inspect no-preload-164104 --format={{.State.Status}}
	I1124 10:23:08.799531 1904000 kic_runner.go:93] Run: chown docker:docker /home/docker/.ssh/authorized_keys
	I1124 10:23:08.799554 1904000 kic_runner.go:114] Args: [docker exec --privileged no-preload-164104 chown docker:docker /home/docker/.ssh/authorized_keys]
	I1124 10:23:08.840707 1904000 cli_runner.go:164] Run: docker container inspect no-preload-164104 --format={{.State.Status}}
	I1124 10:23:08.858152 1904000 machine.go:94] provisionDockerMachine start ...
	I1124 10:23:08.858254 1904000 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" no-preload-164104
	I1124 10:23:08.875690 1904000 main.go:143] libmachine: Using SSH client type: native
	I1124 10:23:08.876023 1904000 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 34964 <nil> <nil>}
	I1124 10:23:08.876038 1904000 main.go:143] libmachine: About to run SSH command:
	hostname
	I1124 10:23:08.876701 1904000 main.go:143] libmachine: Error dialing TCP: ssh: handshake failed: EOF
	I1124 10:23:12.034495 1904000 main.go:143] libmachine: SSH cmd err, output: <nil>: no-preload-164104
	
	I1124 10:23:12.034521 1904000 ubuntu.go:182] provisioning hostname "no-preload-164104"
	I1124 10:23:12.034631 1904000 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" no-preload-164104
	I1124 10:23:12.055228 1904000 main.go:143] libmachine: Using SSH client type: native
	I1124 10:23:12.055579 1904000 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 34964 <nil> <nil>}
	I1124 10:23:12.055597 1904000 main.go:143] libmachine: About to run SSH command:
	sudo hostname no-preload-164104 && echo "no-preload-164104" | sudo tee /etc/hostname
	I1124 10:23:12.216315 1904000 main.go:143] libmachine: SSH cmd err, output: <nil>: no-preload-164104
	
	I1124 10:23:12.216438 1904000 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" no-preload-164104
	I1124 10:23:12.234552 1904000 main.go:143] libmachine: Using SSH client type: native
	I1124 10:23:12.234887 1904000 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 34964 <nil> <nil>}
	I1124 10:23:12.234909 1904000 main.go:143] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sno-preload-164104' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 no-preload-164104/g' /etc/hosts;
				else 
					echo '127.0.1.1 no-preload-164104' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1124 10:23:12.386802 1904000 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	I1124 10:23:12.386828 1904000 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/21978-1652607/.minikube CaCertPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/21978-1652607/.minikube}
	I1124 10:23:12.386861 1904000 ubuntu.go:190] setting up certificates
	I1124 10:23:12.386872 1904000 provision.go:84] configureAuth start
	I1124 10:23:12.386955 1904000 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" no-preload-164104
	I1124 10:23:12.409648 1904000 provision.go:143] copyHostCerts
	I1124 10:23:12.409709 1904000 exec_runner.go:144] found /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.pem, removing ...
	I1124 10:23:12.409727 1904000 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.pem
	I1124 10:23:12.409808 1904000 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.pem (1078 bytes)
	I1124 10:23:12.409901 1904000 exec_runner.go:144] found /home/jenkins/minikube-integration/21978-1652607/.minikube/cert.pem, removing ...
	I1124 10:23:12.409906 1904000 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21978-1652607/.minikube/cert.pem
	I1124 10:23:12.409931 1904000 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/21978-1652607/.minikube/cert.pem (1123 bytes)
	I1124 10:23:12.409988 1904000 exec_runner.go:144] found /home/jenkins/minikube-integration/21978-1652607/.minikube/key.pem, removing ...
	I1124 10:23:12.409993 1904000 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21978-1652607/.minikube/key.pem
	I1124 10:23:12.410016 1904000 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/21978-1652607/.minikube/key.pem (1679 bytes)
	I1124 10:23:12.410070 1904000 provision.go:117] generating server cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca-key.pem org=jenkins.no-preload-164104 san=[127.0.0.1 192.168.85.2 localhost minikube no-preload-164104]
	I1124 10:23:12.608711 1904000 provision.go:177] copyRemoteCerts
	I1124 10:23:12.608790 1904000 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1124 10:23:12.608863 1904000 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" no-preload-164104
	I1124 10:23:12.627028 1904000 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34964 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/no-preload-164104/id_rsa Username:docker}
	I1124 10:23:12.735883 1904000 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I1124 10:23:12.755958 1904000 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server.pem --> /etc/docker/server.pem (1220 bytes)
	I1124 10:23:12.774013 1904000 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I1124 10:23:12.791909 1904000 provision.go:87] duration metric: took 405.008997ms to configureAuth
	I1124 10:23:12.791934 1904000 ubuntu.go:206] setting minikube options for container-runtime
	I1124 10:23:12.792113 1904000 config.go:182] Loaded profile config "no-preload-164104": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1124 10:23:12.792119 1904000 machine.go:97] duration metric: took 3.933946194s to provisionDockerMachine
	I1124 10:23:12.792126 1904000 client.go:176] duration metric: took 5.756069547s to LocalClient.Create
	I1124 10:23:12.792149 1904000 start.go:167] duration metric: took 5.756160035s to libmachine.API.Create "no-preload-164104"
	I1124 10:23:12.792158 1904000 start.go:293] postStartSetup for "no-preload-164104" (driver="docker")
	I1124 10:23:12.792168 1904000 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1124 10:23:12.792228 1904000 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1124 10:23:12.792277 1904000 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" no-preload-164104
	I1124 10:23:12.809408 1904000 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34964 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/no-preload-164104/id_rsa Username:docker}
	I1124 10:23:12.916100 1904000 ssh_runner.go:195] Run: cat /etc/os-release
	I1124 10:23:12.919953 1904000 main.go:143] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1124 10:23:12.919987 1904000 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1124 10:23:12.919999 1904000 filesync.go:126] Scanning /home/jenkins/minikube-integration/21978-1652607/.minikube/addons for local assets ...
	I1124 10:23:12.920073 1904000 filesync.go:126] Scanning /home/jenkins/minikube-integration/21978-1652607/.minikube/files for local assets ...
	I1124 10:23:12.920166 1904000 filesync.go:149] local asset: /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/ssl/certs/16544672.pem -> 16544672.pem in /etc/ssl/certs
	I1124 10:23:12.920275 1904000 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I1124 10:23:12.928838 1904000 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/ssl/certs/16544672.pem --> /etc/ssl/certs/16544672.pem (1708 bytes)
	I1124 10:23:12.947215 1904000 start.go:296] duration metric: took 155.042424ms for postStartSetup
	I1124 10:23:12.947673 1904000 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" no-preload-164104
	I1124 10:23:12.965114 1904000 profile.go:143] Saving config to /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/no-preload-164104/config.json ...
	I1124 10:23:12.965416 1904000 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1124 10:23:12.965475 1904000 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" no-preload-164104
	I1124 10:23:12.983297 1904000 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34964 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/no-preload-164104/id_rsa Username:docker}
	I1124 10:23:13.087847 1904000 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1124 10:23:13.092641 1904000 start.go:128] duration metric: took 6.060547429s to createHost
	I1124 10:23:13.092669 1904000 start.go:83] releasing machines lock for "no-preload-164104", held for 6.060713774s
	I1124 10:23:13.092742 1904000 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" no-preload-164104
	I1124 10:23:13.110360 1904000 ssh_runner.go:195] Run: cat /version.json
	I1124 10:23:13.110390 1904000 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1124 10:23:13.110414 1904000 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" no-preload-164104
	I1124 10:23:13.110549 1904000 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" no-preload-164104
	I1124 10:23:13.132106 1904000 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34964 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/no-preload-164104/id_rsa Username:docker}
	I1124 10:23:13.149091 1904000 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34964 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/no-preload-164104/id_rsa Username:docker}
	I1124 10:23:13.334951 1904000 ssh_runner.go:195] Run: systemctl --version
	I1124 10:23:13.341618 1904000 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W1124 10:23:13.345991 1904000 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1124 10:23:13.346060 1904000 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1124 10:23:13.376406 1904000 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist, /etc/cni/net.d/10-crio-bridge.conflist.disabled] bridge cni config(s)
	I1124 10:23:13.376434 1904000 start.go:496] detecting cgroup driver to use...
	I1124 10:23:13.376467 1904000 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1124 10:23:13.376517 1904000 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I1124 10:23:13.391557 1904000 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I1124 10:23:13.404594 1904000 docker.go:218] disabling cri-docker service (if available) ...
	I1124 10:23:13.404654 1904000 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1124 10:23:13.422093 1904000 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1124 10:23:13.441023 1904000 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1124 10:23:13.560819 1904000 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1124 10:23:13.678105 1904000 docker.go:234] disabling docker service ...
	I1124 10:23:13.678184 1904000 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1124 10:23:13.700158 1904000 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1124 10:23:13.713427 1904000 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1124 10:23:13.832266 1904000 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1124 10:23:13.967664 1904000 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1124 10:23:13.980752 1904000 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I1124 10:23:13.995040 1904000 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 10:23:14.165802 1904000 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10.1"|' /etc/containerd/config.toml"
	I1124 10:23:14.175503 1904000 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I1124 10:23:14.184715 1904000 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I1124 10:23:14.184790 1904000 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I1124 10:23:14.193985 1904000 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1124 10:23:14.203121 1904000 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I1124 10:23:14.212334 1904000 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1124 10:23:14.221773 1904000 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1124 10:23:14.230598 1904000 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I1124 10:23:14.239745 1904000 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I1124 10:23:14.248889 1904000 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
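The sed rewrites above can be inspected in isolation by replaying them against a minimal scratch config.toml (the file contents here are a made-up minimal fixture, not the real containerd config; the sed expressions are copied from the log):

```shell
# Minimal replay of the config.toml edits: pin the sandbox image,
# force SystemdCgroup off (the host uses cgroupfs), and re-insert
# enable_unprivileged_ports under the CRI plugin section.
cfg=$(mktemp)
cat > "$cfg" <<'EOF'
[plugins."io.containerd.grpc.v1.cri"]
  sandbox_image = "registry.k8s.io/pause:3.9"
  SystemdCgroup = true
  enable_unprivileged_ports = false
EOF
sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10.1"|' "$cfg"
sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' "$cfg"
sed -i '/^ *enable_unprivileged_ports = .*/d' "$cfg"
sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' "$cfg"
cat "$cfg"
```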
	I1124 10:23:14.257943 1904000 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1124 10:23:14.265891 1904000 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1124 10:23:14.273415 1904000 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1124 10:23:14.393556 1904000 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I1124 10:23:14.481565 1904000 start.go:543] Will wait 60s for socket path /run/containerd/containerd.sock
	I1124 10:23:14.481690 1904000 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I1124 10:23:14.486051 1904000 start.go:564] Will wait 60s for crictl version
	I1124 10:23:14.486157 1904000 ssh_runner.go:195] Run: which crictl
	I1124 10:23:14.489753 1904000 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1124 10:23:14.515056 1904000 start.go:580] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v2.1.5
	RuntimeApiVersion:  v1
	I1124 10:23:14.515168 1904000 ssh_runner.go:195] Run: containerd --version
	I1124 10:23:14.538324 1904000 ssh_runner.go:195] Run: containerd --version
	I1124 10:23:14.565775 1904000 out.go:179] * Preparing Kubernetes v1.35.0-beta.0 on containerd 2.1.5 ...
	I1124 10:23:14.568730 1904000 cli_runner.go:164] Run: docker network inspect no-preload-164104 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1124 10:23:14.585063 1904000 ssh_runner.go:195] Run: grep 192.168.85.1	host.minikube.internal$ /etc/hosts
	I1124 10:23:14.589077 1904000 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.168.85.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
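The /etc/hosts update above uses a filter-then-append idiom: drop any existing host.minikube.internal entry, re-add it with the current gateway IP, and copy the result back. Replayed here against a scratch hosts file (the 192.168.85.1 address is from the log; the fixture contents are assumptions):

```shell
# Filter out any stale host.minikube.internal line, append the fresh
# mapping, and swap the file in place -- same shape as the log's
# { grep -v ...; echo ...; } > tmp; cp tmp /etc/hosts pipeline.
hosts=$(mktemp)
tab=$(printf '\t')
printf '127.0.0.1\tlocalhost\n192.168.85.1\thost.minikube.internal\n' > "$hosts"
{ grep -v "${tab}host.minikube.internal$" "$hosts"; printf '192.168.85.1\thost.minikube.internal\n'; } > "$hosts.new"
mv "$hosts.new" "$hosts"
cat "$hosts"
```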
	I1124 10:23:14.598964 1904000 kubeadm.go:884] updating cluster {Name:no-preload-164104 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:3072 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:no-preload-164104 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.85.2 Port:8443 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1124 10:23:14.599227 1904000 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 10:23:14.763695 1904000 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 10:23:14.918596 1904000 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 10:23:15.092395 1904000 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
	I1124 10:23:15.092485 1904000 ssh_runner.go:195] Run: sudo crictl images --output json
	I1124 10:23:15.120551 1904000 containerd.go:623] couldn't find preloaded image for "registry.k8s.io/kube-apiserver:v1.35.0-beta.0". assuming images are not preloaded.
	I1124 10:23:15.120575 1904000 cache_images.go:90] LoadCachedImages start: [registry.k8s.io/kube-apiserver:v1.35.0-beta.0 registry.k8s.io/kube-controller-manager:v1.35.0-beta.0 registry.k8s.io/kube-scheduler:v1.35.0-beta.0 registry.k8s.io/kube-proxy:v1.35.0-beta.0 registry.k8s.io/pause:3.10.1 registry.k8s.io/etcd:3.5.24-0 registry.k8s.io/coredns/coredns:v1.13.1 gcr.io/k8s-minikube/storage-provisioner:v5]
	I1124 10:23:15.120635 1904000 image.go:138] retrieving image: gcr.io/k8s-minikube/storage-provisioner:v5
	I1124 10:23:15.120900 1904000 image.go:138] retrieving image: registry.k8s.io/kube-apiserver:v1.35.0-beta.0
	I1124 10:23:15.121029 1904000 image.go:138] retrieving image: registry.k8s.io/kube-controller-manager:v1.35.0-beta.0
	I1124 10:23:15.121114 1904000 image.go:138] retrieving image: registry.k8s.io/kube-scheduler:v1.35.0-beta.0
	I1124 10:23:15.121221 1904000 image.go:138] retrieving image: registry.k8s.io/kube-proxy:v1.35.0-beta.0
	I1124 10:23:15.121315 1904000 image.go:138] retrieving image: registry.k8s.io/pause:3.10.1
	I1124 10:23:15.121398 1904000 image.go:138] retrieving image: registry.k8s.io/etcd:3.5.24-0
	I1124 10:23:15.121474 1904000 image.go:138] retrieving image: registry.k8s.io/coredns/coredns:v1.13.1
	I1124 10:23:15.124017 1904000 image.go:181] daemon lookup for registry.k8s.io/coredns/coredns:v1.13.1: Error response from daemon: No such image: registry.k8s.io/coredns/coredns:v1.13.1
	I1124 10:23:15.124255 1904000 image.go:181] daemon lookup for registry.k8s.io/kube-proxy:v1.35.0-beta.0: Error response from daemon: No such image: registry.k8s.io/kube-proxy:v1.35.0-beta.0
	I1124 10:23:15.124678 1904000 image.go:181] daemon lookup for gcr.io/k8s-minikube/storage-provisioner:v5: Error response from daemon: No such image: gcr.io/k8s-minikube/storage-provisioner:v5
	I1124 10:23:15.124026 1904000 image.go:181] daemon lookup for registry.k8s.io/kube-apiserver:v1.35.0-beta.0: Error response from daemon: No such image: registry.k8s.io/kube-apiserver:v1.35.0-beta.0
	I1124 10:23:15.125559 1904000 image.go:181] daemon lookup for registry.k8s.io/kube-scheduler:v1.35.0-beta.0: Error response from daemon: No such image: registry.k8s.io/kube-scheduler:v1.35.0-beta.0
	I1124 10:23:15.126305 1904000 image.go:181] daemon lookup for registry.k8s.io/pause:3.10.1: Error response from daemon: No such image: registry.k8s.io/pause:3.10.1
	I1124 10:23:15.126730 1904000 image.go:181] daemon lookup for registry.k8s.io/etcd:3.5.24-0: Error response from daemon: No such image: registry.k8s.io/etcd:3.5.24-0
	I1124 10:23:15.127065 1904000 image.go:181] daemon lookup for registry.k8s.io/kube-controller-manager:v1.35.0-beta.0: Error response from daemon: No such image: registry.k8s.io/kube-controller-manager:v1.35.0-beta.0
	I1124 10:23:15.466636 1904000 containerd.go:267] Checking existence of image with name "registry.k8s.io/pause:3.10.1" and sha "d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd"
	I1124 10:23:15.466732 1904000 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images ls name==registry.k8s.io/pause:3.10.1
	I1124 10:23:15.484337 1904000 containerd.go:267] Checking existence of image with name "registry.k8s.io/kube-controller-manager:v1.35.0-beta.0" and sha "68b5f775f18769fcb77bd8474c80bda2050163b6c66f4551f352b7381b8ca5be"
	I1124 10:23:15.484422 1904000 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images ls name==registry.k8s.io/kube-controller-manager:v1.35.0-beta.0
	I1124 10:23:15.491305 1904000 cache_images.go:118] "registry.k8s.io/pause:3.10.1" needs transfer: "registry.k8s.io/pause:3.10.1" does not exist at hash "d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd" in container runtime
	I1124 10:23:15.491347 1904000 cri.go:218] Removing image: registry.k8s.io/pause:3.10.1
	I1124 10:23:15.491396 1904000 ssh_runner.go:195] Run: which crictl
	I1124 10:23:15.492358 1904000 containerd.go:267] Checking existence of image with name "registry.k8s.io/coredns/coredns:v1.13.1" and sha "e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf"
	I1124 10:23:15.492454 1904000 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images ls name==registry.k8s.io/coredns/coredns:v1.13.1
	I1124 10:23:15.504428 1904000 containerd.go:267] Checking existence of image with name "registry.k8s.io/kube-scheduler:v1.35.0-beta.0" and sha "16378741539f1be9c6e347d127537d379a6592587b09b4eb47964cb5c43a409b"
	I1124 10:23:15.504555 1904000 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images ls name==registry.k8s.io/kube-scheduler:v1.35.0-beta.0
	I1124 10:23:15.505057 1904000 containerd.go:267] Checking existence of image with name "registry.k8s.io/kube-proxy:v1.35.0-beta.0" and sha "404c2e12861777b763b8feaa316d36680fc68ad308a8d2f6e55f1bb981cdd904"
	I1124 10:23:15.505146 1904000 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images ls name==registry.k8s.io/kube-proxy:v1.35.0-beta.0
	I1124 10:23:15.505456 1904000 containerd.go:267] Checking existence of image with name "registry.k8s.io/etcd:3.5.24-0" and sha "1211402d28f5813ed906916bfcdd0a7404c2f9048ef5bb54387a6745bc410eca"
	I1124 10:23:15.505532 1904000 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images ls name==registry.k8s.io/etcd:3.5.24-0
	I1124 10:23:15.516588 1904000 cache_images.go:118] "registry.k8s.io/kube-controller-manager:v1.35.0-beta.0" needs transfer: "registry.k8s.io/kube-controller-manager:v1.35.0-beta.0" does not exist at hash "68b5f775f18769fcb77bd8474c80bda2050163b6c66f4551f352b7381b8ca5be" in container runtime
	I1124 10:23:15.516662 1904000 cri.go:218] Removing image: registry.k8s.io/kube-controller-manager:v1.35.0-beta.0
	I1124 10:23:15.516729 1904000 ssh_runner.go:195] Run: which crictl
	I1124 10:23:15.516804 1904000 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/pause:3.10.1
	I1124 10:23:15.523261 1904000 cache_images.go:118] "registry.k8s.io/coredns/coredns:v1.13.1" needs transfer: "registry.k8s.io/coredns/coredns:v1.13.1" does not exist at hash "e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf" in container runtime
	I1124 10:23:15.523308 1904000 cri.go:218] Removing image: registry.k8s.io/coredns/coredns:v1.13.1
	I1124 10:23:15.523369 1904000 ssh_runner.go:195] Run: which crictl
	I1124 10:23:15.533705 1904000 containerd.go:267] Checking existence of image with name "registry.k8s.io/kube-apiserver:v1.35.0-beta.0" and sha "ccd634d9bcc36ac6235e9c86676cd3a02c06afc3788a25f1bbf39ca7d44585f4"
	I1124 10:23:15.533780 1904000 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images ls name==registry.k8s.io/kube-apiserver:v1.35.0-beta.0
	I1124 10:23:15.557136 1904000 cache_images.go:118] "registry.k8s.io/kube-scheduler:v1.35.0-beta.0" needs transfer: "registry.k8s.io/kube-scheduler:v1.35.0-beta.0" does not exist at hash "16378741539f1be9c6e347d127537d379a6592587b09b4eb47964cb5c43a409b" in container runtime
	I1124 10:23:15.557183 1904000 cri.go:218] Removing image: registry.k8s.io/kube-scheduler:v1.35.0-beta.0
	I1124 10:23:15.557239 1904000 ssh_runner.go:195] Run: which crictl
	I1124 10:23:15.582119 1904000 cache_images.go:118] "registry.k8s.io/etcd:3.5.24-0" needs transfer: "registry.k8s.io/etcd:3.5.24-0" does not exist at hash "1211402d28f5813ed906916bfcdd0a7404c2f9048ef5bb54387a6745bc410eca" in container runtime
	I1124 10:23:15.582234 1904000 cri.go:218] Removing image: registry.k8s.io/etcd:3.5.24-0
	I1124 10:23:15.582321 1904000 ssh_runner.go:195] Run: which crictl
	I1124 10:23:15.582419 1904000 cache_images.go:118] "registry.k8s.io/kube-proxy:v1.35.0-beta.0" needs transfer: "registry.k8s.io/kube-proxy:v1.35.0-beta.0" does not exist at hash "404c2e12861777b763b8feaa316d36680fc68ad308a8d2f6e55f1bb981cdd904" in container runtime
	I1124 10:23:15.582553 1904000 cri.go:218] Removing image: registry.k8s.io/kube-proxy:v1.35.0-beta.0
	I1124 10:23:15.582617 1904000 ssh_runner.go:195] Run: which crictl
	I1124 10:23:15.594019 1904000 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/pause:3.10.1
	I1124 10:23:15.594132 1904000 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/coredns/coredns:v1.13.1
	I1124 10:23:15.594166 1904000 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-controller-manager:v1.35.0-beta.0
	I1124 10:23:15.599765 1904000 cache_images.go:118] "registry.k8s.io/kube-apiserver:v1.35.0-beta.0" needs transfer: "registry.k8s.io/kube-apiserver:v1.35.0-beta.0" does not exist at hash "ccd634d9bcc36ac6235e9c86676cd3a02c06afc3788a25f1bbf39ca7d44585f4" in container runtime
	I1124 10:23:15.599813 1904000 cri.go:218] Removing image: registry.k8s.io/kube-apiserver:v1.35.0-beta.0
	I1124 10:23:15.599861 1904000 ssh_runner.go:195] Run: which crictl
	I1124 10:23:15.599945 1904000 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-scheduler:v1.35.0-beta.0
	I1124 10:23:15.600001 1904000 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/etcd:3.5.24-0
	I1124 10:23:15.602241 1904000 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-proxy:v1.35.0-beta.0
	I1124 10:23:15.700838 1904000 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/coredns/coredns:v1.13.1
	I1124 10:23:15.700966 1904000 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/pause:3.10.1
	I1124 10:23:15.701065 1904000 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-controller-manager:v1.35.0-beta.0
	I1124 10:23:15.701180 1904000 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-apiserver:v1.35.0-beta.0
	I1124 10:23:15.701214 1904000 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-scheduler:v1.35.0-beta.0
	I1124 10:23:15.701253 1904000 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-proxy:v1.35.0-beta.0
	I1124 10:23:15.701122 1904000 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/etcd:3.5.24-0
	I1124 10:23:15.769071 1904000 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/coredns/coredns:v1.13.1
	I1124 10:23:15.820058 1904000 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/etcd:3.5.24-0
	I1124 10:23:15.820177 1904000 cache_images.go:291] Loading image from: /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1
	I1124 10:23:15.820273 1904000 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/images/pause_3.10.1
	I1124 10:23:15.820366 1904000 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-controller-manager:v1.35.0-beta.0
	I1124 10:23:15.820446 1904000 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-apiserver:v1.35.0-beta.0
	I1124 10:23:15.820537 1904000 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-scheduler:v1.35.0-beta.0
	I1124 10:23:15.820632 1904000 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-proxy:v1.35.0-beta.0
	I1124 10:23:15.820705 1904000 cache_images.go:291] Loading image from: /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1
	I1124 10:23:15.820779 1904000 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/images/coredns_v1.13.1
	I1124 10:23:15.891690 1904000 cache_images.go:291] Loading image from: /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.5.24-0
	I1124 10:23:15.891858 1904000 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/images/etcd_3.5.24-0
	I1124 10:23:15.896823 1904000 cache_images.go:291] Loading image from: /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0
	I1124 10:23:15.896945 1904000 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/images/kube-controller-manager_v1.35.0-beta.0
	I1124 10:23:15.897029 1904000 ssh_runner.go:352] existence check for /var/lib/minikube/images/pause_3.10.1: stat -c "%s %y" /var/lib/minikube/images/pause_3.10.1: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/images/pause_3.10.1': No such file or directory
	I1124 10:23:15.897063 1904000 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 --> /var/lib/minikube/images/pause_3.10.1 (268288 bytes)
	I1124 10:23:15.905174 1904000 cache_images.go:291] Loading image from: /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0
	I1124 10:23:15.905319 1904000 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-apiserver:v1.35.0-beta.0
	I1124 10:23:15.905374 1904000 cache_images.go:291] Loading image from: /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0
	I1124 10:23:15.905451 1904000 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/images/kube-scheduler_v1.35.0-beta.0
	I1124 10:23:15.905503 1904000 ssh_runner.go:352] existence check for /var/lib/minikube/images/coredns_v1.13.1: stat -c "%s %y" /var/lib/minikube/images/coredns_v1.13.1: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/images/coredns_v1.13.1': No such file or directory
	I1124 10:23:15.905523 1904000 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 --> /var/lib/minikube/images/coredns_v1.13.1 (21178368 bytes)
	I1124 10:23:15.905676 1904000 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/images/kube-proxy_v1.35.0-beta.0
	I1124 10:23:15.912085 1904000 ssh_runner.go:352] existence check for /var/lib/minikube/images/etcd_3.5.24-0: stat -c "%s %y" /var/lib/minikube/images/etcd_3.5.24-0: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/images/etcd_3.5.24-0': No such file or directory
	I1124 10:23:15.912119 1904000 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.5.24-0 --> /var/lib/minikube/images/etcd_3.5.24-0 (21895168 bytes)
	I1124 10:23:15.912179 1904000 ssh_runner.go:352] existence check for /var/lib/minikube/images/kube-controller-manager_v1.35.0-beta.0: stat -c "%s %y" /var/lib/minikube/images/kube-controller-manager_v1.35.0-beta.0: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/images/kube-controller-manager_v1.35.0-beta.0': No such file or directory
	I1124 10:23:15.912197 1904000 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 --> /var/lib/minikube/images/kube-controller-manager_v1.35.0-beta.0 (20672000 bytes)
	I1124 10:23:15.974728 1904000 ssh_runner.go:352] existence check for /var/lib/minikube/images/kube-proxy_v1.35.0-beta.0: stat -c "%s %y" /var/lib/minikube/images/kube-proxy_v1.35.0-beta.0: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/images/kube-proxy_v1.35.0-beta.0': No such file or directory
	I1124 10:23:15.974825 1904000 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 --> /var/lib/minikube/images/kube-proxy_v1.35.0-beta.0 (22432256 bytes)
	I1124 10:23:15.974937 1904000 cache_images.go:291] Loading image from: /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0
	I1124 10:23:15.975058 1904000 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/images/kube-apiserver_v1.35.0-beta.0
	I1124 10:23:15.975139 1904000 ssh_runner.go:352] existence check for /var/lib/minikube/images/kube-scheduler_v1.35.0-beta.0: stat -c "%s %y" /var/lib/minikube/images/kube-scheduler_v1.35.0-beta.0: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/images/kube-scheduler_v1.35.0-beta.0': No such file or directory
	I1124 10:23:15.975180 1904000 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 --> /var/lib/minikube/images/kube-scheduler_v1.35.0-beta.0 (15401984 bytes)
	I1124 10:23:15.980480 1904000 containerd.go:285] Loading image: /var/lib/minikube/images/pause_3.10.1
	I1124 10:23:15.980623 1904000 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images import /var/lib/minikube/images/pause_3.10.1
	I1124 10:23:16.054941 1904000 ssh_runner.go:352] existence check for /var/lib/minikube/images/kube-apiserver_v1.35.0-beta.0: stat -c "%s %y" /var/lib/minikube/images/kube-apiserver_v1.35.0-beta.0: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/images/kube-apiserver_v1.35.0-beta.0': No such file or directory
	I1124 10:23:16.054983 1904000 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 --> /var/lib/minikube/images/kube-apiserver_v1.35.0-beta.0 (24689152 bytes)
	I1124 10:23:16.276292 1904000 cache_images.go:323] Transferred and loaded /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 from cache
	I1124 10:23:16.444747 1904000 containerd.go:285] Loading image: /var/lib/minikube/images/kube-scheduler_v1.35.0-beta.0
	I1124 10:23:16.444861 1904000 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images import /var/lib/minikube/images/kube-scheduler_v1.35.0-beta.0
	W1124 10:23:16.503425 1904000 image.go:286] image gcr.io/k8s-minikube/storage-provisioner:v5 arch mismatch: want arm64 got amd64. fixing
	I1124 10:23:16.503658 1904000 containerd.go:267] Checking existence of image with name "gcr.io/k8s-minikube/storage-provisioner:v5" and sha "66749159455b3f08c8318fe0233122f54d0f5889f9c5fdfb73c3fd9d99895b51"
	I1124 10:23:16.503738 1904000 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images ls name==gcr.io/k8s-minikube/storage-provisioner:v5
	I1124 10:23:17.759396 1904000 ssh_runner.go:235] Completed: sudo ctr -n=k8s.io images import /var/lib/minikube/images/kube-scheduler_v1.35.0-beta.0: (1.314488367s)
	I1124 10:23:17.759441 1904000 cache_images.go:323] Transferred and loaded /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 from cache
	I1124 10:23:17.759479 1904000 containerd.go:285] Loading image: /var/lib/minikube/images/coredns_v1.13.1
	I1124 10:23:17.759507 1904000 ssh_runner.go:235] Completed: sudo ctr -n=k8s.io images ls name==gcr.io/k8s-minikube/storage-provisioner:v5: (1.255744218s)
	I1124 10:23:17.759534 1904000 cache_images.go:118] "gcr.io/k8s-minikube/storage-provisioner:v5" needs transfer: "gcr.io/k8s-minikube/storage-provisioner:v5" does not exist at hash "66749159455b3f08c8318fe0233122f54d0f5889f9c5fdfb73c3fd9d99895b51" in container runtime
	I1124 10:23:17.759557 1904000 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images import /var/lib/minikube/images/coredns_v1.13.1
	I1124 10:23:17.759571 1904000 cri.go:218] Removing image: gcr.io/k8s-minikube/storage-provisioner:v5
	I1124 10:23:17.759611 1904000 ssh_runner.go:195] Run: which crictl
	I1124 10:23:18.809526 1904000 ssh_runner.go:235] Completed: sudo ctr -n=k8s.io images import /var/lib/minikube/images/coredns_v1.13.1: (1.049944083s)
	I1124 10:23:18.809550 1904000 cache_images.go:323] Transferred and loaded /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 from cache
	I1124 10:23:18.809567 1904000 containerd.go:285] Loading image: /var/lib/minikube/images/kube-controller-manager_v1.35.0-beta.0
	I1124 10:23:18.809570 1904000 ssh_runner.go:235] Completed: which crictl: (1.049943246s)
	I1124 10:23:18.809613 1904000 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images import /var/lib/minikube/images/kube-controller-manager_v1.35.0-beta.0
	I1124 10:23:18.809623 1904000 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi gcr.io/k8s-minikube/storage-provisioner:v5
	I1124 10:23:19.747331 1904000 cache_images.go:323] Transferred and loaded /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 from cache
	I1124 10:23:19.747367 1904000 containerd.go:285] Loading image: /var/lib/minikube/images/etcd_3.5.24-0
	I1124 10:23:19.747422 1904000 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images import /var/lib/minikube/images/etcd_3.5.24-0
	I1124 10:23:19.747475 1904000 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi gcr.io/k8s-minikube/storage-provisioner:v5
	I1124 10:23:21.197865 1904000 ssh_runner.go:235] Completed: sudo ctr -n=k8s.io images import /var/lib/minikube/images/etcd_3.5.24-0: (1.450420126s)
	I1124 10:23:21.197896 1904000 cache_images.go:323] Transferred and loaded /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.5.24-0 from cache
	I1124 10:23:21.197915 1904000 containerd.go:285] Loading image: /var/lib/minikube/images/kube-proxy_v1.35.0-beta.0
	I1124 10:23:21.197970 1904000 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images import /var/lib/minikube/images/kube-proxy_v1.35.0-beta.0
	I1124 10:23:21.198033 1904000 ssh_runner.go:235] Completed: sudo /usr/local/bin/crictl rmi gcr.io/k8s-minikube/storage-provisioner:v5: (1.450541949s)
	I1124 10:23:21.198070 1904000 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi gcr.io/k8s-minikube/storage-provisioner:v5
	I1124 10:23:22.167547 1904000 cache_images.go:291] Loading image from: /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5
	I1124 10:23:22.167652 1904000 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/images/storage-provisioner_v5
	I1124 10:23:22.167698 1904000 cache_images.go:323] Transferred and loaded /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 from cache
	I1124 10:23:22.167712 1904000 containerd.go:285] Loading image: /var/lib/minikube/images/kube-apiserver_v1.35.0-beta.0
	I1124 10:23:22.167736 1904000 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images import /var/lib/minikube/images/kube-apiserver_v1.35.0-beta.0
	I1124 10:23:23.256642 1904000 ssh_runner.go:235] Completed: sudo ctr -n=k8s.io images import /var/lib/minikube/images/kube-apiserver_v1.35.0-beta.0: (1.088882371s)
	I1124 10:23:23.256678 1904000 cache_images.go:323] Transferred and loaded /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 from cache
	I1124 10:23:23.256711 1904000 ssh_runner.go:235] Completed: stat -c "%s %y" /var/lib/minikube/images/storage-provisioner_v5: (1.0890502s)
	I1124 10:23:23.256725 1904000 ssh_runner.go:352] existence check for /var/lib/minikube/images/storage-provisioner_v5: stat -c "%s %y" /var/lib/minikube/images/storage-provisioner_v5: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/images/storage-provisioner_v5': No such file or directory
	I1124 10:23:23.256751 1904000 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 --> /var/lib/minikube/images/storage-provisioner_v5 (8035840 bytes)
	I1124 10:23:23.339956 1904000 containerd.go:285] Loading image: /var/lib/minikube/images/storage-provisioner_v5
	I1124 10:23:23.340031 1904000 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images import /var/lib/minikube/images/storage-provisioner_v5
	I1124 10:23:23.718354 1904000 cache_images.go:323] Transferred and loaded /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 from cache
	I1124 10:23:23.718402 1904000 cache_images.go:125] Successfully loaded all cached images
	I1124 10:23:23.718408 1904000 cache_images.go:94] duration metric: took 8.59781746s to LoadCachedImages
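Each "Loading image from" step above follows the same pattern: stat the target on the remote, and only copy the tarball from the local cache when the stat fails with "No such file or directory". A scratch-path sketch of that existence check (local cp stands in for the scp over ssh_runner; the paths and payload are assumptions):

```shell
# Existence-check-then-copy, as seen for pause_3.10.1 and the other
# cached images: skip the transfer when the target already exists.
cache=$(mktemp -d); target=$(mktemp -d)
printf 'image-bytes' > "$cache/pause_3.10.1"
if ! stat -c "%s %y" "$target/pause_3.10.1" >/dev/null 2>&1; then
  cp "$cache/pause_3.10.1" "$target/pause_3.10.1"
fi
stat -c %s "$target/pause_3.10.1"
```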
	I1124 10:23:23.718420 1904000 kubeadm.go:935] updating node { 192.168.85.2 8443 v1.35.0-beta.0 containerd true true} ...
	I1124 10:23:23.718603 1904000 kubeadm.go:947] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.35.0-beta.0/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=no-preload-164104 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.85.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.35.0-beta.0 ClusterName:no-preload-164104 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I1124 10:23:23.718681 1904000 ssh_runner.go:195] Run: sudo crictl info
	I1124 10:23:23.746822 1904000 cni.go:84] Creating CNI manager for ""
	I1124 10:23:23.746850 1904000 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1124 10:23:23.746869 1904000 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1124 10:23:23.746892 1904000 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.85.2 APIServerPort:8443 KubernetesVersion:v1.35.0-beta.0 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:no-preload-164104 NodeName:no-preload-164104 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.85.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.85.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1124 10:23:23.747042 1904000 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.85.2
	  bindPort: 8443
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "no-preload-164104"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.85.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.85.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8443
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.35.0-beta.0
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I1124 10:23:23.747121 1904000 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.35.0-beta.0
	I1124 10:23:23.754996 1904000 binaries.go:54] Didn't find k8s binaries: sudo ls /var/lib/minikube/binaries/v1.35.0-beta.0: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/var/lib/minikube/binaries/v1.35.0-beta.0': No such file or directory
	
	Initiating transfer...
	I1124 10:23:23.755063 1904000 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/binaries/v1.35.0-beta.0
	I1124 10:23:23.762440 1904000 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubectl?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubectl.sha256
	I1124 10:23:23.762595 1904000 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl
	I1124 10:23:23.762690 1904000 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubelet?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubelet.sha256
	I1124 10:23:23.762721 1904000 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1124 10:23:23.762798 1904000 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1124 10:23:23.762849 1904000 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.35.0-beta.0/kubeadm
	I1124 10:23:23.768273 1904000 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl: stat -c "%s %y" /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.35.0-beta.0/kubectl': No such file or directory
	I1124 10:23:23.768304 1904000 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/linux/arm64/v1.35.0-beta.0/kubectl --> /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl (55181496 bytes)
	I1124 10:23:23.784348 1904000 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.35.0-beta.0/kubeadm: stat -c "%s %y" /var/lib/minikube/binaries/v1.35.0-beta.0/kubeadm: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.35.0-beta.0/kubeadm': No such file or directory
	I1124 10:23:23.784386 1904000 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/linux/arm64/v1.35.0-beta.0/kubeadm --> /var/lib/minikube/binaries/v1.35.0-beta.0/kubeadm (68354232 bytes)
	I1124 10:23:23.784555 1904000 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.35.0-beta.0/kubelet
	I1124 10:23:23.806548 1904000 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.35.0-beta.0/kubelet: stat -c "%s %y" /var/lib/minikube/binaries/v1.35.0-beta.0/kubelet: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.35.0-beta.0/kubelet': No such file or directory
	I1124 10:23:23.806587 1904000 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/linux/arm64/v1.35.0-beta.0/kubelet --> /var/lib/minikube/binaries/v1.35.0-beta.0/kubelet (54329636 bytes)
	I1124 10:23:24.637753 1904000 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1124 10:23:24.646112 1904000 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (328 bytes)
	I1124 10:23:24.660323 1904000 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (359 bytes)
	I1124 10:23:24.675200 1904000 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2237 bytes)
	I1124 10:23:24.689730 1904000 ssh_runner.go:195] Run: grep 192.168.85.2	control-plane.minikube.internal$ /etc/hosts
	I1124 10:23:24.693784 1904000 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.168.85.2	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I1124 10:23:24.704215 1904000 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1124 10:23:24.829331 1904000 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1124 10:23:24.855007 1904000 certs.go:69] Setting up /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/no-preload-164104 for IP: 192.168.85.2
	I1124 10:23:24.855031 1904000 certs.go:195] generating shared ca certs ...
	I1124 10:23:24.855049 1904000 certs.go:227] acquiring lock for ca certs: {Name:mkbe540a30c4376a351176f7fe6fec044d058b09 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1124 10:23:24.855262 1904000 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.key
	I1124 10:23:24.855333 1904000 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/proxy-client-ca.key
	I1124 10:23:24.855347 1904000 certs.go:257] generating profile certs ...
	I1124 10:23:24.855422 1904000 certs.go:364] generating signed profile cert for "minikube-user": /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/no-preload-164104/client.key
	I1124 10:23:24.855469 1904000 crypto.go:68] Generating cert /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/no-preload-164104/client.crt with IP's: []
	I1124 10:23:25.421467 1904000 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/no-preload-164104/client.crt ...
	I1124 10:23:25.421498 1904000 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/no-preload-164104/client.crt: {Name:mkd76a3c6fc2be8e6fb6ba52fef4c2a7f5dff9af Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1124 10:23:25.421729 1904000 crypto.go:164] Writing key to /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/no-preload-164104/client.key ...
	I1124 10:23:25.421744 1904000 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/no-preload-164104/client.key: {Name:mk4b814935f61b2bf57ee064e0e629532a116176 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1124 10:23:25.421847 1904000 certs.go:364] generating signed profile cert for "minikube": /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/no-preload-164104/apiserver.key.6b752d89
	I1124 10:23:25.421864 1904000 crypto.go:68] Generating cert /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/no-preload-164104/apiserver.crt.6b752d89 with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.168.85.2]
	I1124 10:23:26.007303 1904000 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/no-preload-164104/apiserver.crt.6b752d89 ...
	I1124 10:23:26.007343 1904000 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/no-preload-164104/apiserver.crt.6b752d89: {Name:mk6b61a7f4228cc21b3a8579cc7dbdb3a1edfb47 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1124 10:23:26.007579 1904000 crypto.go:164] Writing key to /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/no-preload-164104/apiserver.key.6b752d89 ...
	I1124 10:23:26.007590 1904000 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/no-preload-164104/apiserver.key.6b752d89: {Name:mk324cdfed92c67cb2aa62a3f2670c086bd67efb Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1124 10:23:26.007676 1904000 certs.go:382] copying /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/no-preload-164104/apiserver.crt.6b752d89 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/no-preload-164104/apiserver.crt
	I1124 10:23:26.007759 1904000 certs.go:386] copying /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/no-preload-164104/apiserver.key.6b752d89 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/no-preload-164104/apiserver.key
	I1124 10:23:26.007821 1904000 certs.go:364] generating signed profile cert for "aggregator": /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/no-preload-164104/proxy-client.key
	I1124 10:23:26.007836 1904000 crypto.go:68] Generating cert /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/no-preload-164104/proxy-client.crt with IP's: []
	I1124 10:23:26.354408 1904000 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/no-preload-164104/proxy-client.crt ...
	I1124 10:23:26.354443 1904000 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/no-preload-164104/proxy-client.crt: {Name:mka5e76e227bba97dc6454864f59808a9b964186 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1124 10:23:26.354646 1904000 crypto.go:164] Writing key to /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/no-preload-164104/proxy-client.key ...
	I1124 10:23:26.354661 1904000 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/no-preload-164104/proxy-client.key: {Name:mkda091d2eb6dc70cf76cd03b0a302cff9547b22 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1124 10:23:26.354870 1904000 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/1654467.pem (1338 bytes)
	W1124 10:23:26.354922 1904000 certs.go:480] ignoring /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/1654467_empty.pem, impossibly tiny 0 bytes
	I1124 10:23:26.354932 1904000 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca-key.pem (1671 bytes)
	I1124 10:23:26.354959 1904000 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/ca.pem (1078 bytes)
	I1124 10:23:26.354988 1904000 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/cert.pem (1123 bytes)
	I1124 10:23:26.355015 1904000 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/key.pem (1679 bytes)
	I1124 10:23:26.355075 1904000 certs.go:484] found cert: /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/ssl/certs/16544672.pem (1708 bytes)
	I1124 10:23:26.355689 1904000 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1124 10:23:26.374035 1904000 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I1124 10:23:26.396441 1904000 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1124 10:23:26.417740 1904000 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I1124 10:23:26.437326 1904000 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/no-preload-164104/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I1124 10:23:26.461538 1904000 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/no-preload-164104/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I1124 10:23:26.479413 1904000 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/no-preload-164104/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1124 10:23:26.497367 1904000 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/no-preload-164104/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I1124 10:23:26.514999 1904000 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/ssl/certs/16544672.pem --> /usr/share/ca-certificates/16544672.pem (1708 bytes)
	I1124 10:23:26.532113 1904000 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1124 10:23:26.548927 1904000 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21978-1652607/.minikube/certs/1654467.pem --> /usr/share/ca-certificates/1654467.pem (1338 bytes)
	I1124 10:23:26.566408 1904000 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I1124 10:23:26.579352 1904000 ssh_runner.go:195] Run: openssl version
	I1124 10:23:26.585781 1904000 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/16544672.pem && ln -fs /usr/share/ca-certificates/16544672.pem /etc/ssl/certs/16544672.pem"
	I1124 10:23:26.594338 1904000 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/16544672.pem
	I1124 10:23:26.598449 1904000 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Nov 24 09:10 /usr/share/ca-certificates/16544672.pem
	I1124 10:23:26.598609 1904000 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/16544672.pem
	I1124 10:23:26.639976 1904000 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/16544672.pem /etc/ssl/certs/3ec20f2e.0"
	I1124 10:23:26.649431 1904000 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I1124 10:23:26.658420 1904000 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1124 10:23:26.663253 1904000 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Nov 24 08:44 /usr/share/ca-certificates/minikubeCA.pem
	I1124 10:23:26.663327 1904000 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1124 10:23:26.705339 1904000 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I1124 10:23:26.713744 1904000 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1654467.pem && ln -fs /usr/share/ca-certificates/1654467.pem /etc/ssl/certs/1654467.pem"
	I1124 10:23:26.721984 1904000 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1654467.pem
	I1124 10:23:26.726018 1904000 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Nov 24 09:10 /usr/share/ca-certificates/1654467.pem
	I1124 10:23:26.726086 1904000 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1654467.pem
	I1124 10:23:26.767364 1904000 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1654467.pem /etc/ssl/certs/51391683.0"
	I1124 10:23:26.775708 1904000 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1124 10:23:26.779353 1904000 certs.go:400] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I1124 10:23:26.779406 1904000 kubeadm.go:401] StartCluster: {Name:no-preload-164104 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:3072 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:no-preload-164104 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.85.2 Port:8443 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1124 10:23:26.779507 1904000 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I1124 10:23:26.779575 1904000 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1124 10:23:26.803842 1904000 cri.go:89] found id: ""
	I1124 10:23:26.803961 1904000 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1124 10:23:26.812023 1904000 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1124 10:23:26.819867 1904000 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1124 10:23:26.819948 1904000 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1124 10:23:26.827759 1904000 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1124 10:23:26.827779 1904000 kubeadm.go:158] found existing configuration files:
	
	I1124 10:23:26.827830 1904000 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf
	I1124 10:23:26.835450 1904000 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1124 10:23:26.835564 1904000 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1124 10:23:26.843120 1904000 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf
	I1124 10:23:26.851212 1904000 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1124 10:23:26.851320 1904000 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1124 10:23:26.858928 1904000 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf
	I1124 10:23:26.866754 1904000 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1124 10:23:26.866860 1904000 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1124 10:23:26.874490 1904000 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf
	I1124 10:23:26.882193 1904000 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1124 10:23:26.882255 1904000 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1124 10:23:26.890726 1904000 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1124 10:23:26.927251 1904000 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-beta.0
	I1124 10:23:26.927357 1904000 kubeadm.go:319] [preflight] Running pre-flight checks
	I1124 10:23:26.998819 1904000 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1124 10:23:26.998900 1904000 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1124 10:23:26.998939 1904000 kubeadm.go:319] OS: Linux
	I1124 10:23:26.998988 1904000 kubeadm.go:319] CGROUPS_CPU: enabled
	I1124 10:23:26.999040 1904000 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1124 10:23:26.999091 1904000 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1124 10:23:26.999145 1904000 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1124 10:23:26.999196 1904000 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1124 10:23:26.999247 1904000 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1124 10:23:26.999295 1904000 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1124 10:23:26.999348 1904000 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1124 10:23:26.999397 1904000 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1124 10:23:27.065923 1904000 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1124 10:23:27.066106 1904000 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1124 10:23:27.066254 1904000 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1124 10:23:29.453134 1904000 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1124 10:23:30.622032 1856079 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.001254795s
	I1124 10:23:30.622067 1856079 kubeadm.go:319] 
	I1124 10:23:30.622121 1856079 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1124 10:23:30.622152 1856079 kubeadm.go:319] 	- The kubelet is not running
	I1124 10:23:30.622251 1856079 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1124 10:23:30.622257 1856079 kubeadm.go:319] 
	I1124 10:23:30.622355 1856079 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1124 10:23:30.622385 1856079 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1124 10:23:30.622415 1856079 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1124 10:23:30.622419 1856079 kubeadm.go:319] 
	I1124 10:23:30.626605 1856079 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1124 10:23:30.627034 1856079 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1124 10:23:30.627143 1856079 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1124 10:23:30.627378 1856079 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	I1124 10:23:30.627383 1856079 kubeadm.go:319] 
	I1124 10:23:30.627461 1856079 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	I1124 10:23:30.627514 1856079 kubeadm.go:403] duration metric: took 12m8.973699951s to StartCluster
	I1124 10:23:30.627561 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1124 10:23:30.627621 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1124 10:23:30.664923 1856079 cri.go:89] found id: ""
	I1124 10:23:30.664944 1856079 logs.go:282] 0 containers: []
	W1124 10:23:30.664953 1856079 logs.go:284] No container was found matching "kube-apiserver"
	I1124 10:23:30.664959 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1124 10:23:30.665018 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1124 10:23:30.701134 1856079 cri.go:89] found id: ""
	I1124 10:23:30.701156 1856079 logs.go:282] 0 containers: []
	W1124 10:23:30.701164 1856079 logs.go:284] No container was found matching "etcd"
	I1124 10:23:30.701171 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1124 10:23:30.701233 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1124 10:23:30.735694 1856079 cri.go:89] found id: ""
	I1124 10:23:30.735716 1856079 logs.go:282] 0 containers: []
	W1124 10:23:30.735725 1856079 logs.go:284] No container was found matching "coredns"
	I1124 10:23:30.735731 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1124 10:23:30.735789 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1124 10:23:30.767541 1856079 cri.go:89] found id: ""
	I1124 10:23:30.767562 1856079 logs.go:282] 0 containers: []
	W1124 10:23:30.767571 1856079 logs.go:284] No container was found matching "kube-scheduler"
	I1124 10:23:30.767582 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1124 10:23:30.767639 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1124 10:23:30.797084 1856079 cri.go:89] found id: ""
	I1124 10:23:30.797106 1856079 logs.go:282] 0 containers: []
	W1124 10:23:30.797114 1856079 logs.go:284] No container was found matching "kube-proxy"
	I1124 10:23:30.797121 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1124 10:23:30.797178 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1124 10:23:30.833027 1856079 cri.go:89] found id: ""
	I1124 10:23:30.833099 1856079 logs.go:282] 0 containers: []
	W1124 10:23:30.833170 1856079 logs.go:284] No container was found matching "kube-controller-manager"
	I1124 10:23:30.833195 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1124 10:23:30.833288 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1124 10:23:30.867701 1856079 cri.go:89] found id: ""
	I1124 10:23:30.867774 1856079 logs.go:282] 0 containers: []
	W1124 10:23:30.867798 1856079 logs.go:284] No container was found matching "kindnet"
	I1124 10:23:30.867819 1856079 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1124 10:23:30.867911 1856079 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1124 10:23:30.903593 1856079 cri.go:89] found id: ""
	I1124 10:23:30.903665 1856079 logs.go:282] 0 containers: []
	W1124 10:23:30.903688 1856079 logs.go:284] No container was found matching "storage-provisioner"
	I1124 10:23:30.903713 1856079 logs.go:123] Gathering logs for kubelet ...
	I1124 10:23:30.903758 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1124 10:23:30.982104 1856079 logs.go:123] Gathering logs for dmesg ...
	I1124 10:23:30.982186 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1124 10:23:31.000228 1856079 logs.go:123] Gathering logs for describe nodes ...
	I1124 10:23:31.000255 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1124 10:23:31.092172 1856079 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1124 10:23:31.092203 1856079 logs.go:123] Gathering logs for containerd ...
	I1124 10:23:31.092216 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1124 10:23:31.144366 1856079 logs.go:123] Gathering logs for container status ...
	I1124 10:23:31.144410 1856079 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	W1124 10:23:31.217122 1856079 out.go:434] Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001254795s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	W1124 10:23:31.217172 1856079 out.go:285] * 
	W1124 10:23:31.217227 1856079 out.go:285] X Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001254795s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	W1124 10:23:31.217245 1856079 out.go:285] * 
	W1124 10:23:31.219547 1856079 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1124 10:23:31.229855 1856079 out.go:203] 
	W1124 10:23:31.233484 1856079 out.go:285] X Exiting due to K8S_KUBELET_NOT_RUNNING: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001254795s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	W1124 10:23:31.233540 1856079 out.go:285] * Suggestion: Check output of 'journalctl -xeu kubelet', try passing --extra-config=kubelet.cgroup-driver=systemd to minikube start
	W1124 10:23:31.233566 1856079 out.go:285] * Related issue: https://github.com/kubernetes/minikube/issues/4172
	I1124 10:23:31.237388 1856079 out.go:203] 
	I1124 10:23:29.509874 1904000 out.go:252]   - Generating certificates and keys ...
	I1124 10:23:29.509975 1904000 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1124 10:23:29.510041 1904000 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1124 10:23:29.754994 1904000 kubeadm.go:319] [certs] Generating "apiserver-kubelet-client" certificate and key
	I1124 10:23:29.827639 1904000 kubeadm.go:319] [certs] Generating "front-proxy-ca" certificate and key
	I1124 10:23:29.982318 1904000 kubeadm.go:319] [certs] Generating "front-proxy-client" certificate and key
	I1124 10:23:30.050334 1904000 kubeadm.go:319] [certs] Generating "etcd/ca" certificate and key
	I1124 10:23:30.230534 1904000 kubeadm.go:319] [certs] Generating "etcd/server" certificate and key
	I1124 10:23:30.230907 1904000 kubeadm.go:319] [certs] etcd/server serving cert is signed for DNS names [localhost no-preload-164104] and IPs [192.168.85.2 127.0.0.1 ::1]
	I1124 10:23:30.500532 1904000 kubeadm.go:319] [certs] Generating "etcd/peer" certificate and key
	I1124 10:23:30.500922 1904000 kubeadm.go:319] [certs] etcd/peer serving cert is signed for DNS names [localhost no-preload-164104] and IPs [192.168.85.2 127.0.0.1 ::1]
	I1124 10:23:30.891287 1904000 kubeadm.go:319] [certs] Generating "etcd/healthcheck-client" certificate and key
	I1124 10:23:31.081786 1904000 kubeadm.go:319] [certs] Generating "apiserver-etcd-client" certificate and key
	I1124 10:23:31.316969 1904000 kubeadm.go:319] [certs] Generating "sa" key and public key
	I1124 10:23:31.317047 1904000 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1124 10:23:31.570863 1904000 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1124 10:23:31.726839 1904000 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> containerd <==
	Nov 24 10:15:25 kubernetes-upgrade-188777 containerd[554]: time="2025-11-24T10:15:25.226686498Z" level=info msg="StopPodSandbox for \"c812126488969c8950dbf34d1e9947a838baca78391a8f29476bc81ef0903b2d\""
	Nov 24 10:15:25 kubernetes-upgrade-188777 containerd[554]: time="2025-11-24T10:15:25.226759729Z" level=info msg="Container to stop \"6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5\" must be in running or unknown state, current state \"CONTAINER_EXITED\""
	Nov 24 10:15:25 kubernetes-upgrade-188777 containerd[554]: time="2025-11-24T10:15:25.227141616Z" level=info msg="TearDown network for sandbox \"c812126488969c8950dbf34d1e9947a838baca78391a8f29476bc81ef0903b2d\" successfully"
	Nov 24 10:15:25 kubernetes-upgrade-188777 containerd[554]: time="2025-11-24T10:15:25.227190880Z" level=info msg="StopPodSandbox for \"c812126488969c8950dbf34d1e9947a838baca78391a8f29476bc81ef0903b2d\" returns successfully"
	Nov 24 10:15:25 kubernetes-upgrade-188777 containerd[554]: time="2025-11-24T10:15:25.227480762Z" level=info msg="RemovePodSandbox for \"c812126488969c8950dbf34d1e9947a838baca78391a8f29476bc81ef0903b2d\""
	Nov 24 10:15:25 kubernetes-upgrade-188777 containerd[554]: time="2025-11-24T10:15:25.227513526Z" level=info msg="Forcibly stopping sandbox \"c812126488969c8950dbf34d1e9947a838baca78391a8f29476bc81ef0903b2d\""
	Nov 24 10:15:25 kubernetes-upgrade-188777 containerd[554]: time="2025-11-24T10:15:25.227545928Z" level=info msg="Container to stop \"6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5\" must be in running or unknown state, current state \"CONTAINER_EXITED\""
	Nov 24 10:15:25 kubernetes-upgrade-188777 containerd[554]: time="2025-11-24T10:15:25.227938417Z" level=info msg="TearDown network for sandbox \"c812126488969c8950dbf34d1e9947a838baca78391a8f29476bc81ef0903b2d\" successfully"
	Nov 24 10:15:25 kubernetes-upgrade-188777 containerd[554]: time="2025-11-24T10:15:25.235970632Z" level=info msg="Ensure that sandbox c812126488969c8950dbf34d1e9947a838baca78391a8f29476bc81ef0903b2d in task-service has been cleanup successfully"
	Nov 24 10:15:25 kubernetes-upgrade-188777 containerd[554]: time="2025-11-24T10:15:25.242776078Z" level=info msg="RemovePodSandbox \"c812126488969c8950dbf34d1e9947a838baca78391a8f29476bc81ef0903b2d\" returns successfully"
	Nov 24 10:15:25 kubernetes-upgrade-188777 containerd[554]: time="2025-11-24T10:15:25.625936533Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.5-0\""
	Nov 24 10:15:27 kubernetes-upgrade-188777 containerd[554]: time="2025-11-24T10:15:27.389848518Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.6.5-0\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Nov 24 10:15:27 kubernetes-upgrade-188777 containerd[554]: time="2025-11-24T10:15:27.392724963Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.6.5-0: active requests=0, bytes read=21124062"
	Nov 24 10:15:27 kubernetes-upgrade-188777 containerd[554]: time="2025-11-24T10:15:27.395321815Z" level=info msg="ImageCreate event name:\"sha256:2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Nov 24 10:15:27 kubernetes-upgrade-188777 containerd[554]: time="2025-11-24T10:15:27.399718292Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:042ef9c02799eb9303abf1aa99b09f09d94b8ee3ba0c2dd3f42dc4e1d3dce534\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Nov 24 10:15:27 kubernetes-upgrade-188777 containerd[554]: time="2025-11-24T10:15:27.401088760Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.6.5-0\" with image id \"sha256:2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42\", repo tag \"registry.k8s.io/etcd:3.6.5-0\", repo digest \"registry.k8s.io/etcd@sha256:042ef9c02799eb9303abf1aa99b09f09d94b8ee3ba0c2dd3f42dc4e1d3dce534\", size \"21136588\" in 1.775113769s"
	Nov 24 10:15:27 kubernetes-upgrade-188777 containerd[554]: time="2025-11-24T10:15:27.401584642Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.5-0\" returns image reference \"sha256:2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42\""
	Nov 24 10:20:25 kubernetes-upgrade-188777 containerd[554]: time="2025-11-24T10:20:25.186127369Z" level=info msg="container event discarded" container=b1db32a819d84760b89ef3fe1c5d0523c0c38df0b0e7702da99edf0bd76cfd0f type=CONTAINER_DELETED_EVENT
	Nov 24 10:20:25 kubernetes-upgrade-188777 containerd[554]: time="2025-11-24T10:20:25.200425053Z" level=info msg="container event discarded" container=81faa460a9ae889edf02d8c992724b5faaa8057490a024bc89ac28c1fd2c3495 type=CONTAINER_DELETED_EVENT
	Nov 24 10:20:25 kubernetes-upgrade-188777 containerd[554]: time="2025-11-24T10:20:25.212673658Z" level=info msg="container event discarded" container=c3ef546da7d8f7eca3caf4bb59cb9c3faf64bfe4d63c04ee275ae485554a5c06 type=CONTAINER_DELETED_EVENT
	Nov 24 10:20:25 kubernetes-upgrade-188777 containerd[554]: time="2025-11-24T10:20:25.212725892Z" level=info msg="container event discarded" container=46926d57f873fec463a4be2c6c0ed57c1c3da2fe2e5de956e8241f30ab0137d0 type=CONTAINER_DELETED_EVENT
	Nov 24 10:20:25 kubernetes-upgrade-188777 containerd[554]: time="2025-11-24T10:20:25.228928286Z" level=info msg="container event discarded" container=9f447bed0be302947c33bc5b1413f24faa0c03cd6e32dc24768cafa9c1255834 type=CONTAINER_DELETED_EVENT
	Nov 24 10:20:25 kubernetes-upgrade-188777 containerd[554]: time="2025-11-24T10:20:25.228972455Z" level=info msg="container event discarded" container=9e33292d40bc969bca8dc9b4bfec5481cabc530c68acc9f5b0af867d81e6e4e4 type=CONTAINER_DELETED_EVENT
	Nov 24 10:20:25 kubernetes-upgrade-188777 containerd[554]: time="2025-11-24T10:20:25.246190047Z" level=info msg="container event discarded" container=6cb29aefe543d4eefbe37bbd91b40f35133afa7e9cb727c644b1a39eb3f13fd5 type=CONTAINER_DELETED_EVENT
	Nov 24 10:20:25 kubernetes-upgrade-188777 containerd[554]: time="2025-11-24T10:20:25.246251398Z" level=info msg="container event discarded" container=c812126488969c8950dbf34d1e9947a838baca78391a8f29476bc81ef0903b2d type=CONTAINER_DELETED_EVENT
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[Nov24 08:20] overlayfs: idmapped layers are currently not supported
	[Nov24 08:21] overlayfs: idmapped layers are currently not supported
	[Nov24 08:22] overlayfs: idmapped layers are currently not supported
	[Nov24 08:23] overlayfs: idmapped layers are currently not supported
	[Nov24 08:24] overlayfs: idmapped layers are currently not supported
	[ +19.566509] overlayfs: idmapped layers are currently not supported
	[Nov24 08:25] overlayfs: idmapped layers are currently not supported
	[ +35.934095] overlayfs: idmapped layers are currently not supported
	[Nov24 08:27] overlayfs: idmapped layers are currently not supported
	[Nov24 08:28] overlayfs: idmapped layers are currently not supported
	[Nov24 08:29] overlayfs: idmapped layers are currently not supported
	[Nov24 08:30] overlayfs: idmapped layers are currently not supported
	[Nov24 08:31] overlayfs: idmapped layers are currently not supported
	[ +44.505180] overlayfs: idmapped layers are currently not supported
	[Nov24 08:32] overlayfs: idmapped layers are currently not supported
	[Nov24 08:33] overlayfs: idmapped layers are currently not supported
	[Nov24 08:34] overlayfs: idmapped layers are currently not supported
	[ +21.137877] overlayfs: idmapped layers are currently not supported
	[ +28.724175] overlayfs: idmapped layers are currently not supported
	[Nov24 08:36] overlayfs: idmapped layers are currently not supported
	[Nov24 08:37] overlayfs: idmapped layers are currently not supported
	[Nov24 08:38] overlayfs: idmapped layers are currently not supported
	[  +4.063834] overlayfs: idmapped layers are currently not supported
	[Nov24 08:40] overlayfs: idmapped layers are currently not supported
	[Nov24 08:42] kauditd_printk_skb: 8 callbacks suppressed
	
	
	==> kernel <==
	 10:23:33 up  9:05,  0 user,  load average: 2.04, 1.81, 2.00
	Linux kubernetes-upgrade-188777 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Nov 24 10:23:30 kubernetes-upgrade-188777 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Nov 24 10:23:31 kubernetes-upgrade-188777 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 320.
	Nov 24 10:23:31 kubernetes-upgrade-188777 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 10:23:31 kubernetes-upgrade-188777 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 10:23:31 kubernetes-upgrade-188777 kubelet[14264]: E1124 10:23:31.239389   14264 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Nov 24 10:23:31 kubernetes-upgrade-188777 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Nov 24 10:23:31 kubernetes-upgrade-188777 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Nov 24 10:23:31 kubernetes-upgrade-188777 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 321.
	Nov 24 10:23:31 kubernetes-upgrade-188777 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 10:23:31 kubernetes-upgrade-188777 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 10:23:31 kubernetes-upgrade-188777 kubelet[14281]: E1124 10:23:31.976344   14281 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Nov 24 10:23:31 kubernetes-upgrade-188777 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Nov 24 10:23:31 kubernetes-upgrade-188777 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Nov 24 10:23:32 kubernetes-upgrade-188777 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 322.
	Nov 24 10:23:32 kubernetes-upgrade-188777 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 10:23:32 kubernetes-upgrade-188777 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 10:23:32 kubernetes-upgrade-188777 kubelet[14289]: E1124 10:23:32.726525   14289 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Nov 24 10:23:32 kubernetes-upgrade-188777 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Nov 24 10:23:32 kubernetes-upgrade-188777 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Nov 24 10:23:33 kubernetes-upgrade-188777 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 323.
	Nov 24 10:23:33 kubernetes-upgrade-188777 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 10:23:33 kubernetes-upgrade-188777 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Nov 24 10:23:33 kubernetes-upgrade-188777 kubelet[14346]: E1124 10:23:33.478116   14346 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Nov 24 10:23:33 kubernetes-upgrade-188777 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Nov 24 10:23:33 kubernetes-upgrade-188777 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	

-- /stdout --
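The kubelet crash loop in the log above comes from the cgroup v1 validation error ("kubelet is configured to not run on a host using cgroup v1"). As a diagnostic aside, not part of the test output, a host's cgroup version can be inferred from the filesystem type mounted at /sys/fs/cgroup; a minimal sketch:

```shell
# Map the filesystem type at /sys/fs/cgroup to a cgroup version:
# "cgroup2fs" is the unified cgroup v2 hierarchy; "tmpfs" indicates
# the legacy cgroup v1 layout that the kubelet above rejects.
cgroup_version() {
  case "$1" in
    cgroup2fs) echo "v2" ;;
    tmpfs)     echo "v1" ;;
    *)         echo "unknown" ;;
  esac
}

# On a live host you would feed it the real mount type:
#   cgroup_version "$(stat -fc %T /sys/fs/cgroup)"
cgroup_version tmpfs   # → v1
```

A kubelet that refuses cgroup v1, as in the restart loop above, will fail validation whenever this reports v1, regardless of how many times systemd restarts the unit.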
helpers_test.go:262: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p kubernetes-upgrade-188777 -n kubernetes-upgrade-188777
helpers_test.go:262: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p kubernetes-upgrade-188777 -n kubernetes-upgrade-188777: exit status 2 (364.68388ms)

-- stdout --
	Stopped

-- /stdout --
helpers_test.go:262: status error: exit status 2 (may be ok)
helpers_test.go:264: "kubernetes-upgrade-188777" apiserver is not running, skipping kubectl commands (state="Stopped")
helpers_test.go:175: Cleaning up "kubernetes-upgrade-188777" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-arm64 delete -p kubernetes-upgrade-188777
helpers_test.go:178: (dbg) Done: out/minikube-linux-arm64 delete -p kubernetes-upgrade-188777: (2.157114498s)
--- FAIL: TestKubernetesUpgrade (801.64s)

TestStartStop/group/no-preload/serial/UserAppExistsAfterStop (7200.067s)

=== RUN   TestStartStop/group/no-preload/serial/UserAppExistsAfterStop
start_stop_delete_test.go:272: (dbg) TestStartStop/group/no-preload/serial/UserAppExistsAfterStop: waiting 9m0s for pods matching "k8s-app=kubernetes-dashboard" in namespace "kubernetes-dashboard" ...
helpers_test.go:337: TestStartStop/group/no-preload/serial/UserAppExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.85.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.85.2:8443: connect: connection refused
[previous WARNING line repeated 80 more times]
E1124 10:41:03.604332 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/addons-674149/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
helpers_test.go:337: TestStartStop/group/no-preload/serial/UserAppExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.85.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.85.2:8443: connect: connection refused
[previous warning repeated 20 more times]
E1124 10:41:24.716666 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-941011/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
helpers_test.go:337: TestStartStop/group/no-preload/serial/UserAppExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.85.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.85.2:8443: connect: connection refused
[previous warning repeated 5 more times]
E1124 10:41:30.856759 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/old-k8s-version-662905/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
helpers_test.go:337: TestStartStop/group/no-preload/serial/UserAppExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.85.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.85.2:8443: connect: connection refused
[previous warning repeated 64 more times]
E1124 10:42:36.339358 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/default-k8s-diff-port-335501/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
helpers_test.go:337: TestStartStop/group/no-preload/serial/UserAppExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.85.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.85.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/UserAppExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.85.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.85.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/UserAppExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.85.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.85.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/UserAppExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.85.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.85.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/UserAppExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.85.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.85.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/UserAppExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.85.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.85.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/UserAppExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.85.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.85.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/UserAppExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.85.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.85.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/UserAppExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.85.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.85.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/UserAppExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.85.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.85.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/UserAppExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.85.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.85.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/UserAppExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.85.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.85.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/UserAppExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.85.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.85.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/UserAppExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.85.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.85.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/UserAppExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.85.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.85.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/UserAppExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.85.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.85.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/UserAppExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.85.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.85.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/UserAppExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.85.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.85.2:8443: connect: connection refused
E1124 10:42:53.920819 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/old-k8s-version-662905/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
helpers_test.go:337: TestStartStop/group/no-preload/serial/UserAppExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.85.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.85.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/UserAppExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.85.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.85.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/UserAppExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.85.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.85.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/UserAppExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.85.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.85.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/UserAppExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.85.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.85.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/UserAppExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.85.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.85.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/UserAppExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.85.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.85.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/UserAppExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.85.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.85.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/UserAppExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.85.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.85.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/UserAppExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.85.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.85.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/UserAppExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.85.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.85.2:8443: connect: connection refused
panic: test timed out after 2h0m0s
	running tests:
		TestNetworkPlugins (27m44s)
		TestStartStop (30m6s)
		TestStartStop/group/newest-cni (13m44s)
		TestStartStop/group/newest-cni/serial (13m44s)
		TestStartStop/group/newest-cni/serial/SecondStart (3m41s)
		TestStartStop/group/no-preload (19m58s)
		TestStartStop/group/no-preload/serial (19m58s)
		TestStartStop/group/no-preload/serial/UserAppExistsAfterStop (3m21s)

goroutine 5997 [running]:
testing.(*M).startAlarm.func1()
	/usr/local/go/src/testing/testing.go:2682 +0x2b0
created by time.goFunc
	/usr/local/go/src/time/sleep.go:215 +0x38

goroutine 1 [chan receive, 24 minutes]:
testing.tRunner.func1()
	/usr/local/go/src/testing/testing.go:1891 +0x3d0
testing.tRunner(0x4000358380, 0x40008e1bb8)
	/usr/local/go/src/testing/testing.go:1940 +0x104
testing.runTests(0x40002f22a0, {0x534c580, 0x2c, 0x2c}, {0x40008e1d08?, 0x125774?, 0x5374f20?})
	/usr/local/go/src/testing/testing.go:2475 +0x3b8
testing.(*M).Run(0x4000797ae0)
	/usr/local/go/src/testing/testing.go:2337 +0x530
k8s.io/minikube/test/integration.TestMain(0x4000797ae0)
	/home/jenkins/workspace/Build_Cross/test/integration/main_test.go:64 +0xf0
main.main()
	_testmain.go:133 +0x88

goroutine 179 [sync.Cond.Wait, 3 minutes]:
sync.runtime_notifyListWait(0x400071d750, 0x2d)
	/usr/local/go/src/runtime/sema.go:606 +0x140
sync.(*Cond).Wait(0x400071d740)
	/usr/local/go/src/sync/cond.go:71 +0xa4
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0x3701da0)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/queue.go:277 +0x80
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0x4000b6be00)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:160 +0x38
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:155
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1({0x4000085110?, 0x2475740?})
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x24
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext.func1({0x36e5b30?, 0x4000084460?}, 0x53771e0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:255 +0x58
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext({0x36e5b30, 0x4000084460}, 0x40000daf38, {0x369d6a0, 0x40019c8030}, 0x1)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:256 +0xac
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x40013d4fa8?, {0x369d6a0?, 0x40019c8030?}, 0x80?, 0x0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x4c
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x4001984030, 0x3b9aca00, 0x0, 0x1, 0x4000084460)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:210 +0x7c
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:163
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 166
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:144 +0x174

goroutine 4731 [chan receive, 15 minutes]:
testing.(*T).Run(0x400141b340, {0x296e9ac?, 0x0?}, 0x40004e1d00)
	/usr/local/go/src/testing/testing.go:2005 +0x378
k8s.io/minikube/test/integration.TestStartStop.func1.1(0x400141b340)
	/home/jenkins/workspace/Build_Cross/test/integration/start_stop_delete_test.go:128 +0x7e4
testing.tRunner(0x400141b340, 0x400023da80)
	/usr/local/go/src/testing/testing.go:1934 +0xc8
created by testing.(*T).Run in goroutine 4729
	/usr/local/go/src/testing/testing.go:1997 +0x364

goroutine 165 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0x36fe8a0, {{0x36f3450, 0x400023c080?}, 0x4001a80700?})
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:320 +0x288
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 164
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:157 +0x204

goroutine 884 [IO wait, 110 minutes]:
internal/poll.runtime_pollWait(0xffff7053fa00, 0x72)
	/usr/local/go/src/runtime/netpoll.go:351 +0xa0
internal/poll.(*pollDesc).wait(0x4001a56380?, 0x2d970?, 0x0)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:84 +0x28
internal/poll.(*pollDesc).waitRead(...)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:89
internal/poll.(*FD).Accept(0x4001a56380)
	/usr/local/go/src/internal/poll/fd_unix.go:613 +0x21c
net.(*netFD).accept(0x4001a56380)
	/usr/local/go/src/net/fd_unix.go:161 +0x28
net.(*TCPListener).accept(0x40006eca80)
	/usr/local/go/src/net/tcpsock_posix.go:159 +0x24
net.(*TCPListener).Accept(0x40006eca80)
	/usr/local/go/src/net/tcpsock.go:380 +0x2c
net/http.(*Server).Serve(0x40002da300, {0x36d3140, 0x40006eca80})
	/usr/local/go/src/net/http/server.go:3463 +0x24c
net/http.(*Server).ListenAndServe(0x40002da300)
	/usr/local/go/src/net/http/server.go:3389 +0x80
k8s.io/minikube/test/integration.startHTTPProxy.func1(...)
	/home/jenkins/workspace/Build_Cross/test/integration/functional_test.go:2218
created by k8s.io/minikube/test/integration.startHTTPProxy in goroutine 882
	/home/jenkins/workspace/Build_Cross/test/integration/functional_test.go:2217 +0x104

goroutine 4729 [chan receive, 20 minutes]:
testing.tRunner.func1()
	/usr/local/go/src/testing/testing.go:1891 +0x3d0
testing.tRunner(0x400141ac40, 0x339b718)
	/usr/local/go/src/testing/testing.go:1940 +0x104
created by testing.(*T).Run in goroutine 4545
	/usr/local/go/src/testing/testing.go:1997 +0x364

goroutine 181 [select, 3 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:297 +0x13c
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 180
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:280 +0xb8

goroutine 180 [select, 3 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0x36e5b30, 0x4000084460}, 0x40008f2f40, 0x40008f2f88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/wait.go:210 +0xac
k8s.io/apimachinery/pkg/util/wait.poll({0x36e5b30, 0x4000084460}, 0x0?, 0x40008f2f40, 0x40008f2f88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:260 +0x8c
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0x36e5b30?, 0x4000084460?}, 0x0?, 0x0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:200 +0x40
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0x0?, 0x95c64?, 0x4000048480?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 166
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:146 +0x20c

goroutine 4861 [chan receive, 28 minutes]:
testing.(*testState).waitParallel(0x40006ee960)
	/usr/local/go/src/testing/testing.go:2116 +0x158
testing.(*T).Parallel(0x4001509dc0)
	/usr/local/go/src/testing/testing.go:1709 +0x19c
k8s.io/minikube/test/integration.MaybeParallel(0x4001509dc0)
	/home/jenkins/workspace/Build_Cross/test/integration/helpers_test.go:500 +0x5c
k8s.io/minikube/test/integration.TestNetworkPlugins.func1.1(0x4001509dc0)
	/home/jenkins/workspace/Build_Cross/test/integration/net_test.go:106 +0x2c0
testing.tRunner(0x4001509dc0, 0x4001e6a580)
	/usr/local/go/src/testing/testing.go:1934 +0xc8
created by testing.(*T).Run in goroutine 4748
	/usr/local/go/src/testing/testing.go:1997 +0x364

goroutine 1145 [select, 3 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:297 +0x13c
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 1144
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:280 +0xb8

goroutine 166 [chan receive, 117 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).run(0x4000b6be00, 0x4000084460)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:151 +0x218
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 164
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cache.go:126 +0x4d0

goroutine 5038 [chan receive, 22 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).run(0x4001410c00, 0x4000084460)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:151 +0x218
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 5033
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cache.go:126 +0x4d0

goroutine 5349 [sync.Cond.Wait]:
sync.runtime_notifyListWait(0x400152e410, 0x13)
	/usr/local/go/src/runtime/sema.go:606 +0x140
sync.(*Cond).Wait(0x400152e400)
	/usr/local/go/src/sync/cond.go:71 +0xa4
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0x3701da0)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/queue.go:277 +0x80
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0x4001410d80)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:160 +0x38
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:155
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1({0x40015a6620?, 0x1618bc?})
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x24
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext.func1({0x36e5b30?, 0x4000084460?}, 0x40013d46a8?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:255 +0x58
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext({0x36e5b30, 0x4000084460}, 0x400151df38, {0x369d6a0, 0x40018f2930}, 0x1)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:256 +0xac
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x40013d47a8?, {0x369d6a0?, 0x40018f2930?}, 0xb0?, 0x4000049500?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x4c
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x40015e8a70, 0x3b9aca00, 0x0, 0x1, 0x4000084460)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:210 +0x7c
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:163
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 5346
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:144 +0x174

goroutine 4459 [chan receive, 28 minutes]:
testing.(*T).Run(0x400141a8c0, {0x296d53a?, 0x1cbfc6c6fff9?}, 0x4001944390)
	/usr/local/go/src/testing/testing.go:2005 +0x378
k8s.io/minikube/test/integration.TestNetworkPlugins(0x400141a8c0)
	/home/jenkins/workspace/Build_Cross/test/integration/net_test.go:52 +0xe4
testing.tRunner(0x400141a8c0, 0x339b4e8)
	/usr/local/go/src/testing/testing.go:1934 +0xc8
created by testing.(*T).Run in goroutine 1
	/usr/local/go/src/testing/testing.go:1997 +0x364

goroutine 5492 [chan receive, 3 minutes]:
testing.(*T).Run(0x4001795180, {0x297a61d?, 0x40000006ee?}, 0x40004e1800)
	/usr/local/go/src/testing/testing.go:2005 +0x378
k8s.io/minikube/test/integration.TestStartStop.func1.1.1(0x4001795180)
	/home/jenkins/workspace/Build_Cross/test/integration/start_stop_delete_test.go:153 +0x1b8
testing.tRunner(0x4001795180, 0x40004e1d00)
	/usr/local/go/src/testing/testing.go:1934 +0xc8
created by testing.(*T).Run in goroutine 4731
	/usr/local/go/src/testing/testing.go:1997 +0x364

goroutine 1144 [select, 3 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0x36e5b30, 0x4000084460}, 0x40013c4740, 0x40015f4f88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/wait.go:210 +0xac
k8s.io/apimachinery/pkg/util/wait.poll({0x36e5b30, 0x4000084460}, 0xa7?, 0x40013c4740, 0x40013c4788)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:260 +0x8c
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0x36e5b30?, 0x4000084460?}, 0x0?, 0x40013c4750?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:200 +0x40
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0x36f3450?, 0x400023c080?, 0x3136363536382e39?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 1140
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:146 +0x20c

goroutine 1871 [chan send, 101 minutes]:
os/exec.(*Cmd).watchCtx(0x4001393c80, 0x40015a6d90)
	/usr/local/go/src/os/exec/exec.go:814 +0x280
created by os/exec.(*Cmd).Start in goroutine 1870
	/usr/local/go/src/os/exec/exec.go:775 +0x678

goroutine 1143 [sync.Cond.Wait, 3 minutes]:
sync.runtime_notifyListWait(0x400077f910, 0x2a)
	/usr/local/go/src/runtime/sema.go:606 +0x140
sync.(*Cond).Wait(0x400077f900)
	/usr/local/go/src/sync/cond.go:71 +0xa4
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0x3701da0)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/queue.go:277 +0x80
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0x40018f4cc0)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:160 +0x38
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:155
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1({0x4001431110?, 0x1618bc?})
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x24
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext.func1({0x36e5b30?, 0x4000084460?}, 0x40000a06a8?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:255 +0x58
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext({0x36e5b30, 0x4000084460}, 0x4001501f38, {0x369d6a0, 0x4001adadb0}, 0x1)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:256 +0xac
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x40000a07a8?, {0x369d6a0?, 0x4001adadb0?}, 0xd0?, 0x0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x4c
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x40016a5140, 0x3b9aca00, 0x0, 0x1, 0x4000084460)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:210 +0x7c
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:163
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 1140
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:144 +0x174

goroutine 5345 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0x36fe8a0, {{0x36f3450, 0x400023c080?}, 0x4001586480?})
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:320 +0x288
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 5325
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:157 +0x204

goroutine 2909 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0x36fe8a0, {{0x36f3450, 0x400023c080?}, 0x1?})
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:320 +0x288
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 2784
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:157 +0x204

goroutine 1946 [chan send, 101 minutes]:
os/exec.(*Cmd).watchCtx(0x4001513200, 0x4001430850)
	/usr/local/go/src/os/exec/exec.go:814 +0x280
created by os/exec.(*Cmd).Start in goroutine 1945
	/usr/local/go/src/os/exec/exec.go:775 +0x678

goroutine 4748 [chan receive, 28 minutes]:
testing.tRunner.func1()
	/usr/local/go/src/testing/testing.go:1891 +0x3d0
testing.tRunner(0x4001571a40, 0x4001944390)
	/usr/local/go/src/testing/testing.go:1940 +0x104
created by testing.(*T).Run in goroutine 4459
	/usr/local/go/src/testing/testing.go:1997 +0x364

goroutine 5351 [select]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:297 +0x13c
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 5350
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:280 +0xb8

goroutine 1978 [chan send, 97 minutes]:
os/exec.(*Cmd).watchCtx(0x400174e000, 0x40016eb730)
	/usr/local/go/src/os/exec/exec.go:814 +0x280
created by os/exec.(*Cmd).Start in goroutine 1015
	/usr/local/go/src/os/exec/exec.go:775 +0x678

goroutine 3251 [chan send, 63 minutes]:
os/exec.(*Cmd).watchCtx(0x4000048c00, 0x40019ded20)
	/usr/local/go/src/os/exec/exec.go:814 +0x280
created by os/exec.(*Cmd).Start in goroutine 3250
	/usr/local/go/src/os/exec/exec.go:775 +0x678

goroutine 1140 [chan receive, 108 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).run(0x40018f4cc0, 0x4000084460)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:151 +0x218
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 1132
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cache.go:126 +0x4d0

goroutine 1139 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0x36fe8a0, {{0x36f3450, 0x400023c080?}, 0x3136363536382e39?})
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:320 +0x288
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 1132
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:157 +0x204

goroutine 4936 [chan receive, 28 minutes]:
testing.(*testState).waitParallel(0x40006ee960)
	/usr/local/go/src/testing/testing.go:2116 +0x158
testing.(*T).Parallel(0x4000c6d6c0)
	/usr/local/go/src/testing/testing.go:1709 +0x19c
k8s.io/minikube/test/integration.MaybeParallel(0x4000c6d6c0)
	/home/jenkins/workspace/Build_Cross/test/integration/helpers_test.go:500 +0x5c
k8s.io/minikube/test/integration.TestNetworkPlugins.func1.1(0x4000c6d6c0)
	/home/jenkins/workspace/Build_Cross/test/integration/net_test.go:106 +0x2c0
testing.tRunner(0x4000c6d6c0, 0x40001db500)
	/usr/local/go/src/testing/testing.go:1934 +0xc8
created by testing.(*T).Run in goroutine 4748
	/usr/local/go/src/testing/testing.go:1997 +0x364

goroutine 4749 [chan receive, 28 minutes]:
testing.(*testState).waitParallel(0x40006ee960)
	/usr/local/go/src/testing/testing.go:2116 +0x158
testing.(*T).Parallel(0x4001571c00)
	/usr/local/go/src/testing/testing.go:1709 +0x19c
k8s.io/minikube/test/integration.MaybeParallel(0x4001571c00)
	/home/jenkins/workspace/Build_Cross/test/integration/helpers_test.go:500 +0x5c
k8s.io/minikube/test/integration.TestNetworkPlugins.func1.1(0x4001571c00)
	/home/jenkins/workspace/Build_Cross/test/integration/net_test.go:106 +0x2c0
testing.tRunner(0x4001571c00, 0x4001e6a080)
	/usr/local/go/src/testing/testing.go:1934 +0xc8
created by testing.(*T).Run in goroutine 4748
	/usr/local/go/src/testing/testing.go:1997 +0x364

goroutine 5162 [chan receive, 3 minutes]:
testing.(*T).Run(0x4001871340, {0x2999f88?, 0x40000006ee?}, 0x40004e0680)
	/usr/local/go/src/testing/testing.go:2005 +0x378
k8s.io/minikube/test/integration.TestStartStop.func1.1.1(0x4001871340)
	/home/jenkins/workspace/Build_Cross/test/integration/start_stop_delete_test.go:153 +0x1b8
testing.tRunner(0x4001871340, 0x4001e6a800)
	/usr/local/go/src/testing/testing.go:1934 +0xc8
created by testing.(*T).Run in goroutine 4733
	/usr/local/go/src/testing/testing.go:1997 +0x364

goroutine 5590 [select]:
k8s.io/apimachinery/pkg/util/wait.loopConditionUntilContext({0x36e5798, 0x40002e7d50}, {0x36d37a0, 0x4001885ee0}, 0x1, 0x0, 0x4000c73be0)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/loop.go:66 +0x158
k8s.io/apimachinery/pkg/util/wait.PollUntilContextTimeout({0x36e5798?, 0x40002fb110?}, 0x3b9aca00, 0x4000c73e08?, 0x1, 0x4000c73be0)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:48 +0x8c
k8s.io/minikube/test/integration.PodWait({0x36e5798, 0x40002fb110}, 0x40015a2000, {0x40019126c0, 0x11}, {0x2993f7b, 0x14}, {0x29abe42, 0x1c}, 0x7dba821800)
	/home/jenkins/workspace/Build_Cross/test/integration/helpers_test.go:379 +0x22c
k8s.io/minikube/test/integration.validateAppExistsAfterStop({0x36e5798, 0x40002fb110}, 0x40015a2000, {0x40019126c0, 0x11}, {0x29784e6?, 0x22140bc900161e84?}, {0x692435ef?, 0x4001502f58?}, {0x161f08?, ...})
	/home/jenkins/workspace/Build_Cross/test/integration/start_stop_delete_test.go:272 +0xf8
k8s.io/minikube/test/integration.TestStartStop.func1.1.1.1(0x40015a2000?)
	/home/jenkins/workspace/Build_Cross/test/integration/start_stop_delete_test.go:154 +0x44
testing.tRunner(0x40015a2000, 0x40004e0680)
	/usr/local/go/src/testing/testing.go:1934 +0xc8
created by testing.(*T).Run in goroutine 5162
	/usr/local/go/src/testing/testing.go:1997 +0x364

goroutine 5350 [select]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0x36e5b30, 0x4000084460}, 0x40013c9f40, 0x4001389f88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/wait.go:210 +0xac
k8s.io/apimachinery/pkg/util/wait.poll({0x36e5b30, 0x4000084460}, 0x30?, 0x40013c9f40, 0x40013c9f88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:260 +0x8c
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0x36e5b30?, 0x4000084460?}, 0x40015b0d80?, 0x4001585400?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:200 +0x40
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0x0?, 0x95c64?, 0x400150ef00?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 5346
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:146 +0x20c

goroutine 4545 [chan receive, 30 minutes]:
testing.(*T).Run(0x400141afc0, {0x296d53a?, 0x40000d8f58?}, 0x339b718)
	/usr/local/go/src/testing/testing.go:2005 +0x378
k8s.io/minikube/test/integration.TestStartStop(0x400141afc0)
	/home/jenkins/workspace/Build_Cross/test/integration/start_stop_delete_test.go:46 +0x3c
testing.tRunner(0x400141afc0, 0x339b530)
	/usr/local/go/src/testing/testing.go:1934 +0xc8
created by testing.(*T).Run in goroutine 1
	/usr/local/go/src/testing/testing.go:1997 +0x364

goroutine 3355 [chan send, 63 minutes]:
os/exec.(*Cmd).watchCtx(0x4000048780, 0x4001592850)
	/usr/local/go/src/os/exec/exec.go:814 +0x280
created by os/exec.(*Cmd).Start in goroutine 2775
	/usr/local/go/src/os/exec/exec.go:775 +0x678

goroutine 5043 [select, 3 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:297 +0x13c
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 5042
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:280 +0xb8

goroutine 5604 [sync.Cond.Wait, 3 minutes]:
sync.runtime_notifyListWait(0x400068fed0, 0x0)
	/usr/local/go/src/runtime/sema.go:606 +0x140
sync.(*Cond).Wait(0x400068fec0)
	/usr/local/go/src/sync/cond.go:71 +0xa4
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0x3701da0)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/queue.go:277 +0x80
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0x40019e09c0)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:160 +0x38
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:155
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1({0x40013c6f18?, 0x1618bc?})
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x24
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext.func1({0x36e5b30?, 0x4000084460?}, 0x40013c6ea8?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:255 +0x58
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext({0x36e5b30, 0x4000084460}, 0x40008ecf38, {0x369d6a0, 0x4001b0a0c0}, 0x1)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:256 +0xac
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x36f3450?, {0x369d6a0?, 0x4001b0a0c0?}, 0xa0?, 0x40002bfab0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x4c
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x40018d0020, 0x3b9aca00, 0x0, 0x1, 0x4000084460)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:210 +0x7c
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:163
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 5592
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:144 +0x174

goroutine 2612 [IO wait, 92 minutes]:
internal/poll.runtime_pollWait(0xffff70289e00, 0x72)
	/usr/local/go/src/runtime/netpoll.go:351 +0xa0
internal/poll.(*pollDesc).wait(0x4001a56000?, 0x2d970?, 0x0)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:84 +0x28
internal/poll.(*pollDesc).waitRead(...)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:89
internal/poll.(*FD).Accept(0x4001a56000)
	/usr/local/go/src/internal/poll/fd_unix.go:613 +0x21c
net.(*netFD).accept(0x4001a56000)
	/usr/local/go/src/net/fd_unix.go:161 +0x28
net.(*TCPListener).accept(0x400193c300)
	/usr/local/go/src/net/tcpsock_posix.go:159 +0x24
net.(*TCPListener).Accept(0x400193c300)
	/usr/local/go/src/net/tcpsock.go:380 +0x2c
net/http.(*Server).Serve(0x40002da500, {0x36d3140, 0x400193c300})
	/usr/local/go/src/net/http/server.go:3463 +0x24c
net/http.(*Server).ListenAndServe(0x40002da500)
	/usr/local/go/src/net/http/server.go:3389 +0x80
k8s.io/minikube/test/integration.startHTTPProxy.func1(...)
	/home/jenkins/workspace/Build_Cross/test/integration/functional_test.go:2218
created by k8s.io/minikube/test/integration.startHTTPProxy in goroutine 2610
	/home/jenkins/workspace/Build_Cross/test/integration/functional_test.go:2217 +0x104

goroutine 5566 [syscall, 3 minutes]:
syscall.Syscall6(0x5f, 0x3, 0x14, 0x4001388b18, 0x4, 0x40014f23f0, 0x0)
	/usr/local/go/src/syscall/syscall_linux.go:96 +0x2c
internal/syscall/unix.Waitid(0x4001388c78?, 0x1929a0?, 0xffffde6b0145?, 0x0?, 0x400140a270?)
	/usr/local/go/src/internal/syscall/unix/waitid_linux.go:18 +0x44
os.(*Process).pidfdWait.func1(...)
	/usr/local/go/src/os/pidfd_linux.go:109
os.ignoringEINTR(...)
	/usr/local/go/src/os/file_posix.go:256
os.(*Process).pidfdWait(0x400068e280)
	/usr/local/go/src/os/pidfd_linux.go:108 +0x144
os.(*Process).wait(0x4001388c48?)
	/usr/local/go/src/os/exec_unix.go:25 +0x24
os.(*Process).Wait(...)
	/usr/local/go/src/os/exec.go:340
os/exec.(*Cmd).Wait(0x40015a0300)
	/usr/local/go/src/os/exec/exec.go:922 +0x38
os/exec.(*Cmd).Run(0x40015a0300)
	/usr/local/go/src/os/exec/exec.go:626 +0x38
k8s.io/minikube/test/integration.Run(0x40015a21c0, 0x40015a0300)
	/home/jenkins/workspace/Build_Cross/test/integration/helpers_test.go:103 +0x154
k8s.io/minikube/test/integration.validateSecondStart({0x36e5798, 0x40002ac460}, 0x40015a21c0, {0x4001913908, 0x11}, {0x8f4c72d?, 0x8f4c72d00161e84?}, {0x692435dc?, 0x4001388f58?}, {0x40002db500?, ...})
	/home/jenkins/workspace/Build_Cross/test/integration/start_stop_delete_test.go:254 +0x90
k8s.io/minikube/test/integration.TestStartStop.func1.1.1.1(0x40015a21c0?)
	/home/jenkins/workspace/Build_Cross/test/integration/start_stop_delete_test.go:154 +0x44
testing.tRunner(0x40015a21c0, 0x40004e1800)
	/usr/local/go/src/testing/testing.go:1934 +0xc8
created by testing.(*T).Run in goroutine 5492
	/usr/local/go/src/testing/testing.go:1997 +0x364

goroutine 4733 [chan receive, 20 minutes]:
testing.(*T).Run(0x400141b6c0, {0x296e9ac?, 0x0?}, 0x4001e6a800)
	/usr/local/go/src/testing/testing.go:2005 +0x378
k8s.io/minikube/test/integration.TestStartStop.func1.1(0x400141b6c0)
	/home/jenkins/workspace/Build_Cross/test/integration/start_stop_delete_test.go:128 +0x7e4
testing.tRunner(0x400141b6c0, 0x400023db00)
	/usr/local/go/src/testing/testing.go:1934 +0xc8
created by testing.(*T).Run in goroutine 4729
	/usr/local/go/src/testing/testing.go:1997 +0x364

goroutine 5346 [chan receive, 15 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).run(0x4001410d80, 0x4000084460)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:151 +0x218
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 5325
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cache.go:126 +0x4d0

goroutine 5592 [chan receive, 3 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).run(0x40019e09c0, 0x4000084460)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:151 +0x218
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 5590
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cache.go:126 +0x4d0

goroutine 5037 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0x36fe8a0, {{0x36f3450, 0x400023c080?}, 0x4001509a40?})
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:320 +0x288
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 5033
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:157 +0x204

goroutine 4935 [chan receive, 28 minutes]:
testing.(*testState).waitParallel(0x40006ee960)
	/usr/local/go/src/testing/testing.go:2116 +0x158
testing.(*T).Parallel(0x40014f8540)
	/usr/local/go/src/testing/testing.go:1709 +0x19c
k8s.io/minikube/test/integration.MaybeParallel(0x40014f8540)
	/home/jenkins/workspace/Build_Cross/test/integration/helpers_test.go:500 +0x5c
k8s.io/minikube/test/integration.TestNetworkPlugins.func1.1(0x40014f8540)
	/home/jenkins/workspace/Build_Cross/test/integration/net_test.go:106 +0x2c0
testing.tRunner(0x40014f8540, 0x40001db480)
	/usr/local/go/src/testing/testing.go:1934 +0xc8
created by testing.(*T).Run in goroutine 4748
	/usr/local/go/src/testing/testing.go:1997 +0x364

goroutine 5041 [sync.Cond.Wait]:
sync.runtime_notifyListWait(0x400071dc10, 0x15)
	/usr/local/go/src/runtime/sema.go:606 +0x140
sync.(*Cond).Wait(0x400071dc00)
	/usr/local/go/src/sync/cond.go:71 +0xa4
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0x3701da0)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/queue.go:277 +0x80
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0x4001410c00)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:160 +0x38
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:155
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1({0x40019de9a0?, 0xc?})
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x24
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext.func1({0x36e5b30?, 0x4000084460?}, 0x6ee?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:255 +0x58
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext({0x36e5b30, 0x4000084460}, 0x40008f1f38, {0x369d6a0, 0x40017b61b0}, 0x1)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:256 +0xac
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x40013d2fa8?, {0x369d6a0?, 0x40017b61b0?}, 0x90?, 0x161f90?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x4c
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x4000750360, 0x3b9aca00, 0x0, 0x1, 0x4000084460)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:210 +0x7c
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:163
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 5038
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:144 +0x174

goroutine 4860 [chan receive, 28 minutes]:
testing.(*testState).waitParallel(0x40006ee960)
	/usr/local/go/src/testing/testing.go:2116 +0x158
testing.(*T).Parallel(0x4001509a40)
	/usr/local/go/src/testing/testing.go:1709 +0x19c
k8s.io/minikube/test/integration.MaybeParallel(0x4001509a40)
	/home/jenkins/workspace/Build_Cross/test/integration/helpers_test.go:500 +0x5c
k8s.io/minikube/test/integration.TestNetworkPlugins.func1.1(0x4001509a40)
	/home/jenkins/workspace/Build_Cross/test/integration/net_test.go:106 +0x2c0
testing.tRunner(0x4001509a40, 0x4001e6a500)
	/usr/local/go/src/testing/testing.go:1934 +0xc8
created by testing.(*T).Run in goroutine 4748
	/usr/local/go/src/testing/testing.go:1997 +0x364

goroutine 2910 [chan receive, 64 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).run(0x40008d96e0, 0x4000084460)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:151 +0x218
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 2784
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cache.go:126 +0x4d0

goroutine 5591 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0x36fe8a0, {{0x36f3450, 0x400023c080?}, 0x40015a2000?})
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:320 +0x288
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 5590
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:157 +0x204

goroutine 5042 [select, 3 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0x36e5b30, 0x4000084460}, 0x40000d5f40, 0x40000d5f88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/wait.go:210 +0xac
k8s.io/apimachinery/pkg/util/wait.poll({0x36e5b30, 0x4000084460}, 0xe0?, 0x40000d5f40, 0x40000d5f88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:260 +0x8c
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0x36e5b30?, 0x4000084460?}, 0x400068a600?, 0x4000794c80?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:200 +0x40
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0x0?, 0x95c64?, 0x400068ad80?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 5038
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:146 +0x20c

goroutine 5606 [select, 3 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:297 +0x13c
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 5605
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:280 +0xb8

goroutine 5567 [IO wait, 3 minutes]:
internal/poll.runtime_pollWait(0xffff7053f600, 0x72)
	/usr/local/go/src/runtime/netpoll.go:351 +0xa0
internal/poll.(*pollDesc).wait(0x4000b6b860?, 0x40015a9368?, 0x1)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:84 +0x28
internal/poll.(*pollDesc).waitRead(...)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:89
internal/poll.(*FD).Read(0x4000b6b860, {0x40015a9368, 0x498, 0x498})
	/usr/local/go/src/internal/poll/fd_unix.go:165 +0x1e0
os.(*File).read(...)
	/usr/local/go/src/os/file_posix.go:29
os.(*File).Read(0x40001400f0, {0x40015a9368?, 0x40013c3d48?, 0xcc7cc?})
	/usr/local/go/src/os/file.go:144 +0x68
bytes.(*Buffer).ReadFrom(0x40015d6570, {0x369ba78, 0x40018fc120})
	/usr/local/go/src/bytes/buffer.go:217 +0x90
io.copyBuffer({0x369bc60, 0x40015d6570}, {0x369ba78, 0x40018fc120}, {0x0, 0x0, 0x0})
	/usr/local/go/src/io/io.go:415 +0x14c
io.Copy(...)
	/usr/local/go/src/io/io.go:388
os.genericWriteTo(0x40001400f0?, {0x369bc60, 0x40015d6570})
	/usr/local/go/src/os/file.go:295 +0x58
os.(*File).WriteTo(0x40001400f0, {0x369bc60, 0x40015d6570})
	/usr/local/go/src/os/file.go:273 +0x9c
io.copyBuffer({0x369bc60, 0x40015d6570}, {0x369baf8, 0x40001400f0}, {0x0, 0x0, 0x0})
	/usr/local/go/src/io/io.go:411 +0x98
io.Copy(...)
	/usr/local/go/src/io/io.go:388
os/exec.(*Cmd).writerDescriptor.func1()
	/usr/local/go/src/os/exec/exec.go:596 +0x40
os/exec.(*Cmd).Start.func2(0x40015a21c0?)
	/usr/local/go/src/os/exec/exec.go:749 +0x30
created by os/exec.(*Cmd).Start in goroutine 5566
	/usr/local/go/src/os/exec/exec.go:748 +0x6a4

goroutine 2962 [select, 5 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0x36e5b30, 0x4000084460}, 0x40013c9f40, 0x40015f7f88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/wait.go:210 +0xac
k8s.io/apimachinery/pkg/util/wait.poll({0x36e5b30, 0x4000084460}, 0x98?, 0x40013c9f40, 0x40013c9f88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:260 +0x8c
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0x36e5b30?, 0x4000084460?}, 0x40015b0d80?, 0x4001585400?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:200 +0x40
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0x0?, 0x95c64?, 0x4001586f00?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 2910
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:146 +0x20c

goroutine 4859 [chan receive, 28 minutes]:
testing.(*testState).waitParallel(0x40006ee960)
	/usr/local/go/src/testing/testing.go:2116 +0x158
testing.(*T).Parallel(0x4001509500)
	/usr/local/go/src/testing/testing.go:1709 +0x19c
k8s.io/minikube/test/integration.MaybeParallel(0x4001509500)
	/home/jenkins/workspace/Build_Cross/test/integration/helpers_test.go:500 +0x5c
k8s.io/minikube/test/integration.TestNetworkPlugins.func1.1(0x4001509500)
	/home/jenkins/workspace/Build_Cross/test/integration/net_test.go:106 +0x2c0
testing.tRunner(0x4001509500, 0x4001e6a480)
	/usr/local/go/src/testing/testing.go:1934 +0xc8
created by testing.(*T).Run in goroutine 4748
	/usr/local/go/src/testing/testing.go:1997 +0x364

goroutine 5585 [select, 3 minutes]:
os/exec.(*Cmd).watchCtx(0x40015a0300, 0x40019de690)
	/usr/local/go/src/os/exec/exec.go:789 +0x70
created by os/exec.(*Cmd).Start in goroutine 5566
	/usr/local/go/src/os/exec/exec.go:775 +0x678

goroutine 2963 [select, 5 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:297 +0x13c
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 2962
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:280 +0xb8

goroutine 2961 [sync.Cond.Wait, 5 minutes]:
sync.runtime_notifyListWait(0x40003d4d90, 0x1f)
	/usr/local/go/src/runtime/sema.go:606 +0x140
sync.(*Cond).Wait(0x40003d4d80)
	/usr/local/go/src/sync/cond.go:71 +0xa4
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0x3701da0)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/queue.go:277 +0x80
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0x40008d96e0)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:160 +0x38
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:155
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1({0x4001592a80?, 0x1618bc?})
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x24
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext.func1({0x36e5b30?, 0x4000084460?}, 0x40013cfea8?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:255 +0x58
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext({0x36e5b30, 0x4000084460}, 0x400138af38, {0x369d6a0, 0x4001abf710}, 0x1)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:256 +0xac
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x40013cffa8?, {0x369d6a0?, 0x4001abf710?}, 0x10?, 0x161f90?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x4c
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x4001ad4880, 0x3b9aca00, 0x0, 0x1, 0x4000084460)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:210 +0x7c
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:163
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 2910
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:144 +0x174

goroutine 5568 [IO wait]:
internal/poll.runtime_pollWait(0xffff7053f200, 0x72)
	/usr/local/go/src/runtime/netpoll.go:351 +0xa0
internal/poll.(*pollDesc).wait(0x4000b6b920?, 0x4001c975c9?, 0x1)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:84 +0x28
internal/poll.(*pollDesc).waitRead(...)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:89
internal/poll.(*FD).Read(0x4000b6b920, {0x4001c975c9, 0x66a37, 0x66a37})
	/usr/local/go/src/internal/poll/fd_unix.go:165 +0x1e0
os.(*File).read(...)
	/usr/local/go/src/os/file_posix.go:29
os.(*File).Read(0x4000140118, {0x4001c975c9?, 0x400135cd48?, 0xcc7cc?})
	/usr/local/go/src/os/file.go:144 +0x68
bytes.(*Buffer).ReadFrom(0x40015d65a0, {0x369ba78, 0x40018fc128})
	/usr/local/go/src/bytes/buffer.go:217 +0x90
io.copyBuffer({0x369bc60, 0x40015d65a0}, {0x369ba78, 0x40018fc128}, {0x0, 0x0, 0x0})
	/usr/local/go/src/io/io.go:415 +0x14c
io.Copy(...)
	/usr/local/go/src/io/io.go:388
os.genericWriteTo(0x4000140118?, {0x369bc60, 0x40015d65a0})
	/usr/local/go/src/os/file.go:295 +0x58
os.(*File).WriteTo(0x4000140118, {0x369bc60, 0x40015d65a0})
	/usr/local/go/src/os/file.go:273 +0x9c
io.copyBuffer({0x369bc60, 0x40015d65a0}, {0x369baf8, 0x4000140118}, {0x0, 0x0, 0x0})
	/usr/local/go/src/io/io.go:411 +0x98
io.Copy(...)
	/usr/local/go/src/io/io.go:388
os/exec.(*Cmd).writerDescriptor.func1()
	/usr/local/go/src/os/exec/exec.go:596 +0x40
os/exec.(*Cmd).Start.func2(0x40015a0000?)
	/usr/local/go/src/os/exec/exec.go:749 +0x30
created by os/exec.(*Cmd).Start in goroutine 5566
	/usr/local/go/src/os/exec/exec.go:748 +0x6a4

goroutine 3272 [chan send, 63 minutes]:
os/exec.(*Cmd).watchCtx(0x4000049e00, 0x40019dfd50)
	/usr/local/go/src/os/exec/exec.go:814 +0x280
created by os/exec.(*Cmd).Start in goroutine 3271
	/usr/local/go/src/os/exec/exec.go:775 +0x678

goroutine 5605 [select, 3 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0x36e5b30, 0x4000084460}, 0x400151bf40, 0x400151bf88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/wait.go:210 +0xac
k8s.io/apimachinery/pkg/util/wait.poll({0x36e5b30, 0x4000084460}, 0x28?, 0x400151bf40, 0x400151bf88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:260 +0x8c
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0x36e5b30?, 0x4000084460?}, 0x400068ad80?, 0x40002b9cc0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:200 +0x40
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0x0?, 0x95c64?, 0x4000048780?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 5592
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:146 +0x20c

goroutine 4858 [chan receive, 28 minutes]:
testing.(*testState).waitParallel(0x40006ee960)
	/usr/local/go/src/testing/testing.go:2116 +0x158
testing.(*T).Parallel(0x4001508540)
	/usr/local/go/src/testing/testing.go:1709 +0x19c
k8s.io/minikube/test/integration.MaybeParallel(0x4001508540)
	/home/jenkins/workspace/Build_Cross/test/integration/helpers_test.go:500 +0x5c
k8s.io/minikube/test/integration.TestNetworkPlugins.func1.1(0x4001508540)
	/home/jenkins/workspace/Build_Cross/test/integration/net_test.go:106 +0x2c0
testing.tRunner(0x4001508540, 0x4001e6a400)
	/usr/local/go/src/testing/testing.go:1934 +0xc8
created by testing.(*T).Run in goroutine 4748
	/usr/local/go/src/testing/testing.go:1997 +0x364


Test pass (254/320)

Order passed test Duration
3 TestDownloadOnly/v1.28.0/json-events 6.3
4 TestDownloadOnly/v1.28.0/preload-exists 0
8 TestDownloadOnly/v1.28.0/LogsDuration 0.1
9 TestDownloadOnly/v1.28.0/DeleteAll 0.22
10 TestDownloadOnly/v1.28.0/DeleteAlwaysSucceeds 0.14
12 TestDownloadOnly/v1.34.2/json-events 7.26
13 TestDownloadOnly/v1.34.2/preload-exists 0
17 TestDownloadOnly/v1.34.2/LogsDuration 0.09
18 TestDownloadOnly/v1.34.2/DeleteAll 0.21
19 TestDownloadOnly/v1.34.2/DeleteAlwaysSucceeds 0.13
21 TestDownloadOnly/v1.35.0-beta.0/json-events 4.42
23 TestDownloadOnly/v1.35.0-beta.0/cached-images 0.44
24 TestDownloadOnly/v1.35.0-beta.0/binaries 0
26 TestDownloadOnly/v1.35.0-beta.0/LogsDuration 0.08
27 TestDownloadOnly/v1.35.0-beta.0/DeleteAll 0.21
28 TestDownloadOnly/v1.35.0-beta.0/DeleteAlwaysSucceeds 0.14
30 TestBinaryMirror 0.62
34 TestAddons/PreSetup/EnablingAddonOnNonExistingCluster 0.08
35 TestAddons/PreSetup/DisablingAddonOnNonExistingCluster 0.08
36 TestAddons/Setup 157.44
38 TestAddons/serial/Volcano 41.86
40 TestAddons/serial/GCPAuth/Namespaces 0.18
41 TestAddons/serial/GCPAuth/FakeCredentials 8.83
44 TestAddons/parallel/Registry 16.22
45 TestAddons/parallel/RegistryCreds 0.76
46 TestAddons/parallel/Ingress 20.54
47 TestAddons/parallel/InspektorGadget 11.07
48 TestAddons/parallel/MetricsServer 6.81
50 TestAddons/parallel/CSI 36.08
51 TestAddons/parallel/Headlamp 17.11
52 TestAddons/parallel/CloudSpanner 5.66
54 TestAddons/parallel/NvidiaDevicePlugin 5.55
55 TestAddons/parallel/Yakd 10.87
57 TestAddons/StoppedEnableDisable 12.63
58 TestCertOptions 38.37
59 TestCertExpiration 220.98
61 TestForceSystemdFlag 40.59
62 TestForceSystemdEnv 37.61
67 TestErrorSpam/setup 32.87
68 TestErrorSpam/start 0.83
69 TestErrorSpam/status 1.08
70 TestErrorSpam/pause 1.81
71 TestErrorSpam/unpause 1.89
72 TestErrorSpam/stop 1.64
75 TestFunctional/serial/CopySyncFile 0.01
76 TestFunctional/serial/StartWithProxy 79.06
77 TestFunctional/serial/AuditLog 0
78 TestFunctional/serial/SoftStart 8.3
79 TestFunctional/serial/KubeContext 0.06
80 TestFunctional/serial/KubectlGetPods 0.11
83 TestFunctional/serial/CacheCmd/cache/add_remote 3.45
84 TestFunctional/serial/CacheCmd/cache/add_local 1.3
85 TestFunctional/serial/CacheCmd/cache/CacheDelete 0.06
86 TestFunctional/serial/CacheCmd/cache/list 0.06
87 TestFunctional/serial/CacheCmd/cache/verify_cache_inside_node 0.34
88 TestFunctional/serial/CacheCmd/cache/cache_reload 1.87
89 TestFunctional/serial/CacheCmd/cache/delete 0.13
90 TestFunctional/serial/MinikubeKubectlCmd 0.14
91 TestFunctional/serial/MinikubeKubectlCmdDirectly 0.14
92 TestFunctional/serial/ExtraConfig 54.57
93 TestFunctional/serial/ComponentHealth 0.11
94 TestFunctional/serial/LogsCmd 1.49
95 TestFunctional/serial/LogsFileCmd 1.53
96 TestFunctional/serial/InvalidService 4.5
98 TestFunctional/parallel/ConfigCmd 0.51
100 TestFunctional/parallel/DryRun 0.47
101 TestFunctional/parallel/InternationalLanguage 0.21
102 TestFunctional/parallel/StatusCmd 1.07
107 TestFunctional/parallel/AddonsCmd 0.15
110 TestFunctional/parallel/SSHCmd 0.59
111 TestFunctional/parallel/CpCmd 2.1
113 TestFunctional/parallel/FileSync 0.36
114 TestFunctional/parallel/CertSync 2.26
118 TestFunctional/parallel/NodeLabels 0.11
120 TestFunctional/parallel/NonActiveRuntimeDisabled 0.74
122 TestFunctional/parallel/License 0.36
123 TestFunctional/parallel/Version/short 0.06
124 TestFunctional/parallel/Version/components 1.19
125 TestFunctional/parallel/ImageCommands/ImageListShort 0.22
126 TestFunctional/parallel/ImageCommands/ImageListTable 0.23
127 TestFunctional/parallel/ImageCommands/ImageListJson 0.23
128 TestFunctional/parallel/ImageCommands/ImageListYaml 0.23
129 TestFunctional/parallel/ImageCommands/ImageBuild 3.49
130 TestFunctional/parallel/ImageCommands/Setup 0.69
131 TestFunctional/parallel/UpdateContextCmd/no_changes 0.15
132 TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster 0.15
133 TestFunctional/parallel/UpdateContextCmd/no_clusters 0.15
134 TestFunctional/parallel/ImageCommands/ImageLoadDaemon 1.44
135 TestFunctional/parallel/ImageCommands/ImageReloadDaemon 1.42
136 TestFunctional/parallel/ServiceCmd/DeployApp 8.28
137 TestFunctional/parallel/ImageCommands/ImageTagAndLoadDaemon 1.35
138 TestFunctional/parallel/ImageCommands/ImageSaveToFile 0.34
139 TestFunctional/parallel/ImageCommands/ImageRemove 0.49
140 TestFunctional/parallel/ImageCommands/ImageLoadFromFile 0.65
141 TestFunctional/parallel/ImageCommands/ImageSaveDaemon 0.39
143 TestFunctional/parallel/TunnelCmd/serial/RunSecondTunnel 0.52
144 TestFunctional/parallel/TunnelCmd/serial/StartTunnel 0
147 TestFunctional/parallel/ServiceCmd/List 0.37
148 TestFunctional/parallel/ServiceCmd/JSONOutput 0.35
149 TestFunctional/parallel/ServiceCmd/HTTPS 0.37
150 TestFunctional/parallel/ServiceCmd/Format 0.38
151 TestFunctional/parallel/ServiceCmd/URL 0.41
156 TestFunctional/parallel/TunnelCmd/serial/DeleteTunnel 0.11
157 TestFunctional/parallel/ProfileCmd/profile_not_create 0.46
158 TestFunctional/parallel/ProfileCmd/profile_list 0.45
159 TestFunctional/parallel/ProfileCmd/profile_json_output 0.43
160 TestFunctional/parallel/MountCmd/any-port 7.39
161 TestFunctional/parallel/MountCmd/specific-port 1.99
162 TestFunctional/parallel/MountCmd/VerifyCleanup 1.3
163 TestFunctional/delete_echo-server_images 0.04
164 TestFunctional/delete_my-image_image 0.02
165 TestFunctional/delete_minikube_cached_images 0.02
169 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CopySyncFile 0
171 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/AuditLog 0
173 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/KubeContext 0.06
177 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/add_remote 3.31
178 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/add_local 1.06
179 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/CacheDelete 0.06
180 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/list 0.06
181 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/verify_cache_inside_node 0.32
182 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/cache_reload 1.85
183 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/delete 0.12
188 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/LogsCmd 0.97
189 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/LogsFileCmd 1.03
192 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ConfigCmd 0.45
194 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/DryRun 0.44
195 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/InternationalLanguage 0.22
201 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/AddonsCmd 0.16
204 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/SSHCmd 0.75
205 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/CpCmd 2.17
207 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/FileSync 0.35
208 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/CertSync 2.14
214 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/NonActiveRuntimeDisabled 0.73
216 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/License 0.35
217 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/Version/short 0.07
218 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/Version/components 0.49
219 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageListShort 0.53
220 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageListTable 0.23
221 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageListJson 0.24
222 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageListYaml 0.3
223 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageBuild 3.58
224 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/Setup 0.25
225 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageLoadDaemon 1.42
226 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageReloadDaemon 1.3
227 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageTagAndLoadDaemon 1.74
228 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/UpdateContextCmd/no_changes 0.15
229 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/UpdateContextCmd/no_minikube_cluster 0.15
230 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/UpdateContextCmd/no_clusters 0.16
231 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageSaveToFile 0.47
232 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageRemove 0.54
233 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageLoadFromFile 0.9
234 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageSaveDaemon 0.47
237 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/StartTunnel 0
244 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/DeleteTunnel 0.1
251 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ProfileCmd/profile_not_create 0.44
252 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ProfileCmd/profile_list 0.4
253 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ProfileCmd/profile_json_output 0.44
255 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/MountCmd/specific-port 1.94
256 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/MountCmd/VerifyCleanup 1.9
257 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/delete_echo-server_images 0.04
258 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/delete_my-image_image 0.02
259 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/delete_minikube_cached_images 0.02
263 TestMultiControlPlane/serial/StartCluster 170.6
264 TestMultiControlPlane/serial/DeployApp 8.2
265 TestMultiControlPlane/serial/PingHostFromPods 1.61
266 TestMultiControlPlane/serial/AddWorkerNode 60.54
267 TestMultiControlPlane/serial/NodeLabels 0.11
268 TestMultiControlPlane/serial/HAppyAfterClusterStart 1.11
269 TestMultiControlPlane/serial/CopyFile 20.51
270 TestMultiControlPlane/serial/StopSecondaryNode 13
271 TestMultiControlPlane/serial/DegradedAfterControlPlaneNodeStop 0.85
272 TestMultiControlPlane/serial/RestartSecondaryNode 15.34
273 TestMultiControlPlane/serial/HAppyAfterSecondaryNodeRestart 1.05
274 TestMultiControlPlane/serial/RestartClusterKeepsNodes 94.49
275 TestMultiControlPlane/serial/DeleteSecondaryNode 11.15
276 TestMultiControlPlane/serial/DegradedAfterSecondaryNodeDelete 0.84
277 TestMultiControlPlane/serial/StopCluster 25.75
278 TestMultiControlPlane/serial/RestartCluster 61.29
279 TestMultiControlPlane/serial/DegradedAfterClusterRestart 0.84
280 TestMultiControlPlane/serial/AddSecondaryNode 83.83
281 TestMultiControlPlane/serial/HAppyAfterSecondaryNodeAdd 1.12
286 TestJSONOutput/start/Command 83.65
287 TestJSONOutput/start/Audit 0
289 TestJSONOutput/start/parallel/DistinctCurrentSteps 0
290 TestJSONOutput/start/parallel/IncreasingCurrentSteps 0
292 TestJSONOutput/pause/Command 0.77
293 TestJSONOutput/pause/Audit 0
295 TestJSONOutput/pause/parallel/DistinctCurrentSteps 0
296 TestJSONOutput/pause/parallel/IncreasingCurrentSteps 0
298 TestJSONOutput/unpause/Command 0.61
299 TestJSONOutput/unpause/Audit 0
301 TestJSONOutput/unpause/parallel/DistinctCurrentSteps 0
302 TestJSONOutput/unpause/parallel/IncreasingCurrentSteps 0
304 TestJSONOutput/stop/Command 5.99
305 TestJSONOutput/stop/Audit 0
307 TestJSONOutput/stop/parallel/DistinctCurrentSteps 0
308 TestJSONOutput/stop/parallel/IncreasingCurrentSteps 0
309 TestErrorJSONOutput 0.25
311 TestKicCustomNetwork/create_custom_network 41.57
312 TestKicCustomNetwork/use_default_bridge_network 35.99
313 TestKicExistingNetwork 38.85
314 TestKicCustomSubnet 37.65
315 TestKicStaticIP 39.09
316 TestMainNoArgs 0.05
317 TestMinikubeProfile 72.54
320 TestMountStart/serial/StartWithMountFirst 8.3
321 TestMountStart/serial/VerifyMountFirst 0.27
322 TestMountStart/serial/StartWithMountSecond 8.43
323 TestMountStart/serial/VerifyMountSecond 0.28
324 TestMountStart/serial/DeleteFirst 1.73
325 TestMountStart/serial/VerifyMountPostDelete 0.28
326 TestMountStart/serial/Stop 1.29
327 TestMountStart/serial/RestartStopped 7.33
328 TestMountStart/serial/VerifyMountPostStop 0.28
331 TestMultiNode/serial/FreshStart2Nodes 111.65
332 TestMultiNode/serial/DeployApp2Nodes 4.82
333 TestMultiNode/serial/PingHostFrom2Pods 0.96
334 TestMultiNode/serial/AddNode 27.69
335 TestMultiNode/serial/MultiNodeLabels 0.09
336 TestMultiNode/serial/ProfileList 0.73
337 TestMultiNode/serial/CopyFile 10.88
338 TestMultiNode/serial/StopNode 2.42
339 TestMultiNode/serial/StartAfterStop 7.94
340 TestMultiNode/serial/RestartKeepsNodes 78.67
341 TestMultiNode/serial/DeleteNode 5.75
342 TestMultiNode/serial/StopMultiNode 24.17
343 TestMultiNode/serial/RestartMultiNode 52.01
344 TestMultiNode/serial/ValidateNameConflict 37.79
349 TestPreload 119.87
351 TestScheduledStopUnix 111.36
354 TestInsufficientStorage 13.19
355 TestRunningBinaryUpgrade 64.15
358 TestMissingContainerUpgrade 124.68
360 TestNoKubernetes/serial/StartNoK8sWithVersion 0.09
361 TestNoKubernetes/serial/StartWithK8s 45.55
362 TestNoKubernetes/serial/StartWithStopK8s 24.17
363 TestNoKubernetes/serial/Start 7.64
364 TestNoKubernetes/serial/VerifyNok8sNoK8sDownloads 0
365 TestNoKubernetes/serial/VerifyK8sNotRunning 0.28
366 TestNoKubernetes/serial/ProfileList 0.76
367 TestNoKubernetes/serial/Stop 1.43
368 TestNoKubernetes/serial/StartNoArgs 7.25
369 TestNoKubernetes/serial/VerifyK8sNotRunningSecond 0.28
370 TestStoppedBinaryUpgrade/Setup 0.82
371 TestStoppedBinaryUpgrade/Upgrade 59.48
372 TestStoppedBinaryUpgrade/MinikubeLogs 1.5
381 TestPause/serial/Start 87.97
382 TestPause/serial/SecondStartNoReconfiguration 7.9
383 TestPause/serial/Pause 0.74
384 TestPause/serial/VerifyStatus 0.47
385 TestPause/serial/Unpause 0.74
386 TestPause/serial/PauseAgain 0.87
387 TestPause/serial/DeletePaused 2.72
388 TestPause/serial/VerifyDeletedResources 0.42
TestDownloadOnly/v1.28.0/json-events (6.3s)
=== RUN   TestDownloadOnly/v1.28.0/json-events
aaa_download_only_test.go:80: (dbg) Run:  out/minikube-linux-arm64 start -o=json --download-only -p download-only-935257 --force --alsologtostderr --kubernetes-version=v1.28.0 --container-runtime=containerd --driver=docker  --container-runtime=containerd
aaa_download_only_test.go:80: (dbg) Done: out/minikube-linux-arm64 start -o=json --download-only -p download-only-935257 --force --alsologtostderr --kubernetes-version=v1.28.0 --container-runtime=containerd --driver=docker  --container-runtime=containerd: (6.304057876s)
--- PASS: TestDownloadOnly/v1.28.0/json-events (6.30s)

TestDownloadOnly/v1.28.0/preload-exists (0s)
=== RUN   TestDownloadOnly/v1.28.0/preload-exists
I1124 08:43:10.983403 1654467 preload.go:188] Checking if preload exists for k8s version v1.28.0 and runtime containerd
I1124 08:43:10.983486 1654467 preload.go:203] Found local preload: /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.28.0-containerd-overlay2-arm64.tar.lz4
--- PASS: TestDownloadOnly/v1.28.0/preload-exists (0.00s)

TestDownloadOnly/v1.28.0/LogsDuration (0.1s)
=== RUN   TestDownloadOnly/v1.28.0/LogsDuration
aaa_download_only_test.go:183: (dbg) Run:  out/minikube-linux-arm64 logs -p download-only-935257
aaa_download_only_test.go:183: (dbg) Non-zero exit: out/minikube-linux-arm64 logs -p download-only-935257: exit status 85 (95.565381ms)

-- stdout --
	
	==> Audit <==
	┌─────────┬───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬──────────────────────┬─────────┬─────────┬─────────────────────┬──────────┐
	│ COMMAND │                                                                                         ARGS                                                                                          │       PROFILE        │  USER   │ VERSION │     START TIME      │ END TIME │
	├─────────┼───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼──────────────────────┼─────────┼─────────┼─────────────────────┼──────────┤
	│ start   │ -o=json --download-only -p download-only-935257 --force --alsologtostderr --kubernetes-version=v1.28.0 --container-runtime=containerd --driver=docker  --container-runtime=containerd │ download-only-935257 │ jenkins │ v1.37.0 │ 24 Nov 25 08:43 UTC │          │
	└─────────┴───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴──────────────────────┴─────────┴─────────┴─────────────────────┴──────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/11/24 08:43:04
	Running on machine: ip-172-31-30-239
	Binary: Built with gc go1.25.3 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1124 08:43:04.723113 1654473 out.go:360] Setting OutFile to fd 1 ...
	I1124 08:43:04.723288 1654473 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1124 08:43:04.723319 1654473 out.go:374] Setting ErrFile to fd 2...
	I1124 08:43:04.723341 1654473 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1124 08:43:04.723617 1654473 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21978-1652607/.minikube/bin
	W1124 08:43:04.723797 1654473 root.go:314] Error reading config file at /home/jenkins/minikube-integration/21978-1652607/.minikube/config/config.json: open /home/jenkins/minikube-integration/21978-1652607/.minikube/config/config.json: no such file or directory
	I1124 08:43:04.724216 1654473 out.go:368] Setting JSON to true
	I1124 08:43:04.725052 1654473 start.go:133] hostinfo: {"hostname":"ip-172-31-30-239","uptime":26714,"bootTime":1763947071,"procs":155,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"92f46a7d-c249-4c12-924a-77f64874c910"}
	I1124 08:43:04.725142 1654473 start.go:143] virtualization:  
	I1124 08:43:04.728621 1654473 out.go:99] [download-only-935257] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	W1124 08:43:04.728757 1654473 preload.go:354] Failed to list preload files: open /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/preloaded-tarball: no such file or directory
	I1124 08:43:04.728894 1654473 notify.go:221] Checking for updates...
	I1124 08:43:04.730093 1654473 out.go:171] MINIKUBE_LOCATION=21978
	I1124 08:43:04.731462 1654473 out.go:171] MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1124 08:43:04.732903 1654473 out.go:171] KUBECONFIG=/home/jenkins/minikube-integration/21978-1652607/kubeconfig
	I1124 08:43:04.734035 1654473 out.go:171] MINIKUBE_HOME=/home/jenkins/minikube-integration/21978-1652607/.minikube
	I1124 08:43:04.735129 1654473 out.go:171] MINIKUBE_BIN=out/minikube-linux-arm64
	W1124 08:43:04.737371 1654473 out.go:336] minikube skips various validations when --force is supplied; this may lead to unexpected behavior
	I1124 08:43:04.737637 1654473 driver.go:422] Setting default libvirt URI to qemu:///system
	I1124 08:43:04.759368 1654473 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1124 08:43:04.759469 1654473 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1124 08:43:04.820733 1654473 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:0 ContainersRunning:0 ContainersPaused:0 ContainersStopped:0 Images:1 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:29 OomKillDisable:true NGoroutines:61 SystemTime:2025-11-24 08:43:04.811925448 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214831104 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1124 08:43:04.820840 1654473 docker.go:319] overlay module found
	I1124 08:43:04.822185 1654473 out.go:99] Using the docker driver based on user configuration
	I1124 08:43:04.822219 1654473 start.go:309] selected driver: docker
	I1124 08:43:04.822227 1654473 start.go:927] validating driver "docker" against <nil>
	I1124 08:43:04.822377 1654473 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1124 08:43:04.881530 1654473 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:0 ContainersRunning:0 ContainersPaused:0 ContainersStopped:0 Images:1 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:29 OomKillDisable:true NGoroutines:61 SystemTime:2025-11-24 08:43:04.871306765 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214831104 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1124 08:43:04.881721 1654473 start_flags.go:327] no existing cluster config was found, will generate one from the flags 
	I1124 08:43:04.882063 1654473 start_flags.go:410] Using suggested 3072MB memory alloc based on sys=7834MB, container=7834MB
	I1124 08:43:04.882242 1654473 start_flags.go:974] Wait components to verify : map[apiserver:true system_pods:true]
	I1124 08:43:04.883778 1654473 out.go:171] Using Docker driver with root privileges
	I1124 08:43:04.884926 1654473 cni.go:84] Creating CNI manager for ""
	I1124 08:43:04.884998 1654473 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1124 08:43:04.885011 1654473 start_flags.go:336] Found "CNI" CNI - setting NetworkPlugin=cni
	I1124 08:43:04.885088 1654473 start.go:353] cluster config:
	{Name:download-only-935257 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:3072 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.28.0 ClusterName:download-only-935257 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.28.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1124 08:43:04.886299 1654473 out.go:99] Starting "download-only-935257" primary control-plane node in "download-only-935257" cluster
	I1124 08:43:04.886324 1654473 cache.go:134] Beginning downloading kic base image for docker with containerd
	I1124 08:43:04.887335 1654473 out.go:99] Pulling base image v0.0.48-1763789673-21948 ...
	I1124 08:43:04.887381 1654473 preload.go:188] Checking if preload exists for k8s version v1.28.0 and runtime containerd
	I1124 08:43:04.887548 1654473 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f in local docker daemon
	I1124 08:43:04.903294 1654473 cache.go:163] Downloading gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f to local cache
	I1124 08:43:04.903546 1654473 image.go:65] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f in local cache directory
	I1124 08:43:04.903647 1654473 image.go:150] Writing gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f to local cache
	I1124 08:43:04.939606 1654473 preload.go:148] Found remote preload: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.28.0/preloaded-images-k8s-v18-v1.28.0-containerd-overlay2-arm64.tar.lz4
	I1124 08:43:04.939649 1654473 cache.go:65] Caching tarball of preloaded images
	I1124 08:43:04.939894 1654473 preload.go:188] Checking if preload exists for k8s version v1.28.0 and runtime containerd
	I1124 08:43:04.941418 1654473 out.go:99] Downloading Kubernetes v1.28.0 preload ...
	I1124 08:43:04.941453 1654473 preload.go:318] getting checksum for preloaded-images-k8s-v18-v1.28.0-containerd-overlay2-arm64.tar.lz4 from gcs api...
	I1124 08:43:05.024556 1654473 preload.go:295] Got checksum from GCS API "38d7f581f2fa4226c8af2c9106b982b7"
	I1124 08:43:05.024721 1654473 download.go:108] Downloading: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.28.0/preloaded-images-k8s-v18-v1.28.0-containerd-overlay2-arm64.tar.lz4?checksum=md5:38d7f581f2fa4226c8af2c9106b982b7 -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.28.0-containerd-overlay2-arm64.tar.lz4
	
	
	* The control-plane node download-only-935257 host does not exist
	  To start a cluster, run: "minikube start -p download-only-935257"

-- /stdout --
aaa_download_only_test.go:184: minikube logs failed with error: exit status 85
--- PASS: TestDownloadOnly/v1.28.0/LogsDuration (0.10s)

TestDownloadOnly/v1.28.0/DeleteAll (0.22s)
=== RUN   TestDownloadOnly/v1.28.0/DeleteAll
aaa_download_only_test.go:196: (dbg) Run:  out/minikube-linux-arm64 delete --all
--- PASS: TestDownloadOnly/v1.28.0/DeleteAll (0.22s)

TestDownloadOnly/v1.28.0/DeleteAlwaysSucceeds (0.14s)
=== RUN   TestDownloadOnly/v1.28.0/DeleteAlwaysSucceeds
aaa_download_only_test.go:207: (dbg) Run:  out/minikube-linux-arm64 delete -p download-only-935257
--- PASS: TestDownloadOnly/v1.28.0/DeleteAlwaysSucceeds (0.14s)

TestDownloadOnly/v1.34.2/json-events (7.26s)
=== RUN   TestDownloadOnly/v1.34.2/json-events
aaa_download_only_test.go:80: (dbg) Run:  out/minikube-linux-arm64 start -o=json --download-only -p download-only-122245 --force --alsologtostderr --kubernetes-version=v1.34.2 --container-runtime=containerd --driver=docker  --container-runtime=containerd
aaa_download_only_test.go:80: (dbg) Done: out/minikube-linux-arm64 start -o=json --download-only -p download-only-122245 --force --alsologtostderr --kubernetes-version=v1.34.2 --container-runtime=containerd --driver=docker  --container-runtime=containerd: (7.262594328s)
--- PASS: TestDownloadOnly/v1.34.2/json-events (7.26s)

TestDownloadOnly/v1.34.2/preload-exists (0s)
=== RUN   TestDownloadOnly/v1.34.2/preload-exists
I1124 08:43:18.703094 1654467 preload.go:188] Checking if preload exists for k8s version v1.34.2 and runtime containerd
I1124 08:43:18.703136 1654467 preload.go:203] Found local preload: /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.34.2-containerd-overlay2-arm64.tar.lz4
--- PASS: TestDownloadOnly/v1.34.2/preload-exists (0.00s)

TestDownloadOnly/v1.34.2/LogsDuration (0.09s)
=== RUN   TestDownloadOnly/v1.34.2/LogsDuration
aaa_download_only_test.go:183: (dbg) Run:  out/minikube-linux-arm64 logs -p download-only-122245
aaa_download_only_test.go:183: (dbg) Non-zero exit: out/minikube-linux-arm64 logs -p download-only-122245: exit status 85 (90.270403ms)

-- stdout --
	
	==> Audit <==
	┌─────────┬───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬──────────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│ COMMAND │                                                                                         ARGS                                                                                          │       PROFILE        │  USER   │ VERSION │     START TIME      │      END TIME       │
	├─────────┼───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼──────────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ start   │ -o=json --download-only -p download-only-935257 --force --alsologtostderr --kubernetes-version=v1.28.0 --container-runtime=containerd --driver=docker  --container-runtime=containerd │ download-only-935257 │ jenkins │ v1.37.0 │ 24 Nov 25 08:43 UTC │                     │
	│ delete  │ --all                                                                                                                                                                                 │ minikube             │ jenkins │ v1.37.0 │ 24 Nov 25 08:43 UTC │ 24 Nov 25 08:43 UTC │
	│ delete  │ -p download-only-935257                                                                                                                                                               │ download-only-935257 │ jenkins │ v1.37.0 │ 24 Nov 25 08:43 UTC │ 24 Nov 25 08:43 UTC │
	│ start   │ -o=json --download-only -p download-only-122245 --force --alsologtostderr --kubernetes-version=v1.34.2 --container-runtime=containerd --driver=docker  --container-runtime=containerd │ download-only-122245 │ jenkins │ v1.37.0 │ 24 Nov 25 08:43 UTC │                     │
	└─────────┴───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴──────────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/11/24 08:43:11
	Running on machine: ip-172-31-30-239
	Binary: Built with gc go1.25.3 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1124 08:43:11.483422 1654673 out.go:360] Setting OutFile to fd 1 ...
	I1124 08:43:11.483543 1654673 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1124 08:43:11.483553 1654673 out.go:374] Setting ErrFile to fd 2...
	I1124 08:43:11.483559 1654673 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1124 08:43:11.483817 1654673 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21978-1652607/.minikube/bin
	I1124 08:43:11.484213 1654673 out.go:368] Setting JSON to true
	I1124 08:43:11.485022 1654673 start.go:133] hostinfo: {"hostname":"ip-172-31-30-239","uptime":26721,"bootTime":1763947071,"procs":148,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"92f46a7d-c249-4c12-924a-77f64874c910"}
	I1124 08:43:11.485088 1654673 start.go:143] virtualization:  
	I1124 08:43:11.488580 1654673 out.go:99] [download-only-122245] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1124 08:43:11.488820 1654673 notify.go:221] Checking for updates...
	I1124 08:43:11.491639 1654673 out.go:171] MINIKUBE_LOCATION=21978
	I1124 08:43:11.494640 1654673 out.go:171] MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1124 08:43:11.497580 1654673 out.go:171] KUBECONFIG=/home/jenkins/minikube-integration/21978-1652607/kubeconfig
	I1124 08:43:11.500494 1654673 out.go:171] MINIKUBE_HOME=/home/jenkins/minikube-integration/21978-1652607/.minikube
	I1124 08:43:11.503383 1654673 out.go:171] MINIKUBE_BIN=out/minikube-linux-arm64
	W1124 08:43:11.509038 1654673 out.go:336] minikube skips various validations when --force is supplied; this may lead to unexpected behavior
	I1124 08:43:11.509298 1654673 driver.go:422] Setting default libvirt URI to qemu:///system
	I1124 08:43:11.534531 1654673 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1124 08:43:11.534635 1654673 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1124 08:43:11.590143 1654673 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:0 ContainersRunning:0 ContainersPaused:0 ContainersStopped:0 Images:1 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:29 OomKillDisable:true NGoroutines:49 SystemTime:2025-11-24 08:43:11.580999454 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:a
arch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214831104 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx P
ath:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1124 08:43:11.590245 1654673 docker.go:319] overlay module found
	I1124 08:43:11.593356 1654673 out.go:99] Using the docker driver based on user configuration
	I1124 08:43:11.593391 1654673 start.go:309] selected driver: docker
	I1124 08:43:11.593409 1654673 start.go:927] validating driver "docker" against <nil>
	I1124 08:43:11.593545 1654673 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1124 08:43:11.650363 1654673 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:0 ContainersRunning:0 ContainersPaused:0 ContainersStopped:0 Images:1 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:29 OomKillDisable:true NGoroutines:49 SystemTime:2025-11-24 08:43:11.641697014 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:a
arch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214831104 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx P
ath:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1124 08:43:11.650546 1654673 start_flags.go:327] no existing cluster config was found, will generate one from the flags 
	I1124 08:43:11.650842 1654673 start_flags.go:410] Using suggested 3072MB memory alloc based on sys=7834MB, container=7834MB
	I1124 08:43:11.651001 1654673 start_flags.go:974] Wait components to verify : map[apiserver:true system_pods:true]
	I1124 08:43:11.653999 1654673 out.go:171] Using Docker driver with root privileges
	I1124 08:43:11.656755 1654673 cni.go:84] Creating CNI manager for ""
	I1124 08:43:11.656827 1654673 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1124 08:43:11.656841 1654673 start_flags.go:336] Found "CNI" CNI - setting NetworkPlugin=cni
	I1124 08:43:11.656917 1654673 start.go:353] cluster config:
	{Name:download-only-122245 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:3072 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.2 ClusterName:download-only-122245 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local Co
ntainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.34.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1124 08:43:11.659911 1654673 out.go:99] Starting "download-only-122245" primary control-plane node in "download-only-122245" cluster
	I1124 08:43:11.659928 1654673 cache.go:134] Beginning downloading kic base image for docker with containerd
	I1124 08:43:11.662840 1654673 out.go:99] Pulling base image v0.0.48-1763789673-21948 ...
	I1124 08:43:11.662889 1654673 preload.go:188] Checking if preload exists for k8s version v1.34.2 and runtime containerd
	I1124 08:43:11.662977 1654673 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f in local docker daemon
	I1124 08:43:11.677837 1654673 cache.go:163] Downloading gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f to local cache
	I1124 08:43:11.677982 1654673 image.go:65] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f in local cache directory
	I1124 08:43:11.678001 1654673 image.go:68] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f in local cache directory, skipping pull
	I1124 08:43:11.678006 1654673 image.go:137] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f exists in cache, skipping pull
	I1124 08:43:11.678012 1654673 cache.go:166] successfully saved gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f as a tarball
	I1124 08:43:11.721414 1654673 preload.go:148] Found remote preload: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.34.2/preloaded-images-k8s-v18-v1.34.2-containerd-overlay2-arm64.tar.lz4
	I1124 08:43:11.721437 1654673 cache.go:65] Caching tarball of preloaded images
	I1124 08:43:11.721610 1654673 preload.go:188] Checking if preload exists for k8s version v1.34.2 and runtime containerd
	I1124 08:43:11.724626 1654673 out.go:99] Downloading Kubernetes v1.34.2 preload ...
	I1124 08:43:11.724653 1654673 preload.go:318] getting checksum for preloaded-images-k8s-v18-v1.34.2-containerd-overlay2-arm64.tar.lz4 from gcs api...
	I1124 08:43:11.813710 1654673 preload.go:295] Got checksum from GCS API "cd1a05d5493c9270e248bf47fb3f071d"
	I1124 08:43:11.813762 1654673 download.go:108] Downloading: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.34.2/preloaded-images-k8s-v18-v1.34.2-containerd-overlay2-arm64.tar.lz4?checksum=md5:cd1a05d5493c9270e248bf47fb3f071d -> /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.34.2-containerd-overlay2-arm64.tar.lz4
	
	
	* The control-plane node download-only-122245 host does not exist
	  To start a cluster, run: "minikube start -p download-only-122245"
-- /stdout --
aaa_download_only_test.go:184: minikube logs failed with error: exit status 85
--- PASS: TestDownloadOnly/v1.34.2/LogsDuration (0.09s)

TestDownloadOnly/v1.34.2/DeleteAll (0.21s)

=== RUN   TestDownloadOnly/v1.34.2/DeleteAll
aaa_download_only_test.go:196: (dbg) Run:  out/minikube-linux-arm64 delete --all
--- PASS: TestDownloadOnly/v1.34.2/DeleteAll (0.21s)

TestDownloadOnly/v1.34.2/DeleteAlwaysSucceeds (0.13s)

=== RUN   TestDownloadOnly/v1.34.2/DeleteAlwaysSucceeds
aaa_download_only_test.go:207: (dbg) Run:  out/minikube-linux-arm64 delete -p download-only-122245
--- PASS: TestDownloadOnly/v1.34.2/DeleteAlwaysSucceeds (0.13s)

TestDownloadOnly/v1.35.0-beta.0/json-events (4.42s)

=== RUN   TestDownloadOnly/v1.35.0-beta.0/json-events
aaa_download_only_test.go:80: (dbg) Run:  out/minikube-linux-arm64 start -o=json --download-only -p download-only-343972 --force --alsologtostderr --kubernetes-version=v1.35.0-beta.0 --container-runtime=containerd --driver=docker  --container-runtime=containerd
aaa_download_only_test.go:80: (dbg) Done: out/minikube-linux-arm64 start -o=json --download-only -p download-only-343972 --force --alsologtostderr --kubernetes-version=v1.35.0-beta.0 --container-runtime=containerd --driver=docker  --container-runtime=containerd: (4.417601774s)
--- PASS: TestDownloadOnly/v1.35.0-beta.0/json-events (4.42s)

TestDownloadOnly/v1.35.0-beta.0/cached-images (0.44s)

=== RUN   TestDownloadOnly/v1.35.0-beta.0/cached-images
I1124 08:43:23.704314 1654467 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
I1124 08:43:23.847066 1654467 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
I1124 08:43:23.991912 1654467 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
--- PASS: TestDownloadOnly/v1.35.0-beta.0/cached-images (0.44s)

TestDownloadOnly/v1.35.0-beta.0/binaries (0s)

=== RUN   TestDownloadOnly/v1.35.0-beta.0/binaries
--- PASS: TestDownloadOnly/v1.35.0-beta.0/binaries (0.00s)

TestDownloadOnly/v1.35.0-beta.0/LogsDuration (0.08s)

=== RUN   TestDownloadOnly/v1.35.0-beta.0/LogsDuration
aaa_download_only_test.go:183: (dbg) Run:  out/minikube-linux-arm64 logs -p download-only-343972
aaa_download_only_test.go:183: (dbg) Non-zero exit: out/minikube-linux-arm64 logs -p download-only-343972: exit status 85 (84.526435ms)
-- stdout --
	
	==> Audit <==
	┌─────────┬──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬──────────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│ COMMAND │                                                                                             ARGS                                                                                             │       PROFILE        │  USER   │ VERSION │     START TIME      │      END TIME       │
	├─────────┼──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼──────────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ start   │ -o=json --download-only -p download-only-935257 --force --alsologtostderr --kubernetes-version=v1.28.0 --container-runtime=containerd --driver=docker  --container-runtime=containerd        │ download-only-935257 │ jenkins │ v1.37.0 │ 24 Nov 25 08:43 UTC │                     │
	│ delete  │ --all                                                                                                                                                                                        │ minikube             │ jenkins │ v1.37.0 │ 24 Nov 25 08:43 UTC │ 24 Nov 25 08:43 UTC │
	│ delete  │ -p download-only-935257                                                                                                                                                                      │ download-only-935257 │ jenkins │ v1.37.0 │ 24 Nov 25 08:43 UTC │ 24 Nov 25 08:43 UTC │
	│ start   │ -o=json --download-only -p download-only-122245 --force --alsologtostderr --kubernetes-version=v1.34.2 --container-runtime=containerd --driver=docker  --container-runtime=containerd        │ download-only-122245 │ jenkins │ v1.37.0 │ 24 Nov 25 08:43 UTC │                     │
	│ delete  │ --all                                                                                                                                                                                        │ minikube             │ jenkins │ v1.37.0 │ 24 Nov 25 08:43 UTC │ 24 Nov 25 08:43 UTC │
	│ delete  │ -p download-only-122245                                                                                                                                                                      │ download-only-122245 │ jenkins │ v1.37.0 │ 24 Nov 25 08:43 UTC │ 24 Nov 25 08:43 UTC │
	│ start   │ -o=json --download-only -p download-only-343972 --force --alsologtostderr --kubernetes-version=v1.35.0-beta.0 --container-runtime=containerd --driver=docker  --container-runtime=containerd │ download-only-343972 │ jenkins │ v1.37.0 │ 24 Nov 25 08:43 UTC │                     │
	└─────────┴──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴──────────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/11/24 08:43:19
	Running on machine: ip-172-31-30-239
	Binary: Built with gc go1.25.3 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1124 08:43:19.183074 1654874 out.go:360] Setting OutFile to fd 1 ...
	I1124 08:43:19.183242 1654874 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1124 08:43:19.183272 1654874 out.go:374] Setting ErrFile to fd 2...
	I1124 08:43:19.183293 1654874 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1124 08:43:19.183571 1654874 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21978-1652607/.minikube/bin
	I1124 08:43:19.183995 1654874 out.go:368] Setting JSON to true
	I1124 08:43:19.184813 1654874 start.go:133] hostinfo: {"hostname":"ip-172-31-30-239","uptime":26729,"bootTime":1763947071,"procs":145,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"92f46a7d-c249-4c12-924a-77f64874c910"}
	I1124 08:43:19.184944 1654874 start.go:143] virtualization:  
	I1124 08:43:19.188311 1654874 out.go:99] [download-only-343972] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1124 08:43:19.188546 1654874 notify.go:221] Checking for updates...
	I1124 08:43:19.191384 1654874 out.go:171] MINIKUBE_LOCATION=21978
	I1124 08:43:19.194374 1654874 out.go:171] MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1124 08:43:19.197239 1654874 out.go:171] KUBECONFIG=/home/jenkins/minikube-integration/21978-1652607/kubeconfig
	I1124 08:43:19.200087 1654874 out.go:171] MINIKUBE_HOME=/home/jenkins/minikube-integration/21978-1652607/.minikube
	I1124 08:43:19.203014 1654874 out.go:171] MINIKUBE_BIN=out/minikube-linux-arm64
	W1124 08:43:19.208893 1654874 out.go:336] minikube skips various validations when --force is supplied; this may lead to unexpected behavior
	I1124 08:43:19.209261 1654874 driver.go:422] Setting default libvirt URI to qemu:///system
	I1124 08:43:19.241556 1654874 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1124 08:43:19.241683 1654874 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1124 08:43:19.297665 1654874 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:0 ContainersRunning:0 ContainersPaused:0 ContainersStopped:0 Images:1 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:29 OomKillDisable:true NGoroutines:47 SystemTime:2025-11-24 08:43:19.287928016 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:a
arch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214831104 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx P
ath:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1124 08:43:19.297779 1654874 docker.go:319] overlay module found
	I1124 08:43:19.300847 1654874 out.go:99] Using the docker driver based on user configuration
	I1124 08:43:19.300900 1654874 start.go:309] selected driver: docker
	I1124 08:43:19.300909 1654874 start.go:927] validating driver "docker" against <nil>
	I1124 08:43:19.301029 1654874 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1124 08:43:19.355330 1654874 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:0 ContainersRunning:0 ContainersPaused:0 ContainersStopped:0 Images:1 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:29 OomKillDisable:true NGoroutines:47 SystemTime:2025-11-24 08:43:19.34621239 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aa
rch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214831104 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Pa
th:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1124 08:43:19.355532 1654874 start_flags.go:327] no existing cluster config was found, will generate one from the flags 
	I1124 08:43:19.355812 1654874 start_flags.go:410] Using suggested 3072MB memory alloc based on sys=7834MB, container=7834MB
	I1124 08:43:19.355958 1654874 start_flags.go:974] Wait components to verify : map[apiserver:true system_pods:true]
	I1124 08:43:19.359038 1654874 out.go:171] Using Docker driver with root privileges
	
	
	* The control-plane node download-only-343972 host does not exist
	  To start a cluster, run: "minikube start -p download-only-343972"
-- /stdout --
aaa_download_only_test.go:184: minikube logs failed with error: exit status 85
--- PASS: TestDownloadOnly/v1.35.0-beta.0/LogsDuration (0.08s)

TestDownloadOnly/v1.35.0-beta.0/DeleteAll (0.21s)

=== RUN   TestDownloadOnly/v1.35.0-beta.0/DeleteAll
aaa_download_only_test.go:196: (dbg) Run:  out/minikube-linux-arm64 delete --all
--- PASS: TestDownloadOnly/v1.35.0-beta.0/DeleteAll (0.21s)

TestDownloadOnly/v1.35.0-beta.0/DeleteAlwaysSucceeds (0.14s)

=== RUN   TestDownloadOnly/v1.35.0-beta.0/DeleteAlwaysSucceeds
aaa_download_only_test.go:207: (dbg) Run:  out/minikube-linux-arm64 delete -p download-only-343972
--- PASS: TestDownloadOnly/v1.35.0-beta.0/DeleteAlwaysSucceeds (0.14s)

TestBinaryMirror (0.62s)

=== RUN   TestBinaryMirror
I1124 08:43:25.423966 1654467 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.34.2/bin/linux/arm64/kubectl?checksum=file:https://dl.k8s.io/release/v1.34.2/bin/linux/arm64/kubectl.sha256
aaa_download_only_test.go:309: (dbg) Run:  out/minikube-linux-arm64 start --download-only -p binary-mirror-539266 --alsologtostderr --binary-mirror http://127.0.0.1:44655 --driver=docker  --container-runtime=containerd
helpers_test.go:175: Cleaning up "binary-mirror-539266" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-arm64 delete -p binary-mirror-539266
--- PASS: TestBinaryMirror (0.62s)

TestAddons/PreSetup/EnablingAddonOnNonExistingCluster (0.08s)

=== RUN   TestAddons/PreSetup/EnablingAddonOnNonExistingCluster
=== PAUSE TestAddons/PreSetup/EnablingAddonOnNonExistingCluster

=== CONT  TestAddons/PreSetup/EnablingAddonOnNonExistingCluster
addons_test.go:1000: (dbg) Run:  out/minikube-linux-arm64 addons enable dashboard -p addons-674149
addons_test.go:1000: (dbg) Non-zero exit: out/minikube-linux-arm64 addons enable dashboard -p addons-674149: exit status 85 (82.508593ms)

-- stdout --
	* Profile "addons-674149" not found. Run "minikube profile list" to view all profiles.
	  To start a cluster, run: "minikube start -p addons-674149"
-- /stdout --
--- PASS: TestAddons/PreSetup/EnablingAddonOnNonExistingCluster (0.08s)

TestAddons/PreSetup/DisablingAddonOnNonExistingCluster (0.08s)

=== RUN   TestAddons/PreSetup/DisablingAddonOnNonExistingCluster
=== PAUSE TestAddons/PreSetup/DisablingAddonOnNonExistingCluster

=== CONT  TestAddons/PreSetup/DisablingAddonOnNonExistingCluster
addons_test.go:1011: (dbg) Run:  out/minikube-linux-arm64 addons disable dashboard -p addons-674149
addons_test.go:1011: (dbg) Non-zero exit: out/minikube-linux-arm64 addons disable dashboard -p addons-674149: exit status 85 (75.755195ms)

-- stdout --
	* Profile "addons-674149" not found. Run "minikube profile list" to view all profiles.
	  To start a cluster, run: "minikube start -p addons-674149"
-- /stdout --
--- PASS: TestAddons/PreSetup/DisablingAddonOnNonExistingCluster (0.08s)

TestAddons/Setup (157.44s)

=== RUN   TestAddons/Setup
addons_test.go:108: (dbg) Run:  out/minikube-linux-arm64 start -p addons-674149 --wait=true --memory=4096 --alsologtostderr --addons=registry --addons=registry-creds --addons=metrics-server --addons=volumesnapshots --addons=csi-hostpath-driver --addons=gcp-auth --addons=cloud-spanner --addons=inspektor-gadget --addons=nvidia-device-plugin --addons=yakd --addons=volcano --addons=amd-gpu-device-plugin --driver=docker  --container-runtime=containerd --addons=ingress --addons=ingress-dns --addons=storage-provisioner-rancher
addons_test.go:108: (dbg) Done: out/minikube-linux-arm64 start -p addons-674149 --wait=true --memory=4096 --alsologtostderr --addons=registry --addons=registry-creds --addons=metrics-server --addons=volumesnapshots --addons=csi-hostpath-driver --addons=gcp-auth --addons=cloud-spanner --addons=inspektor-gadget --addons=nvidia-device-plugin --addons=yakd --addons=volcano --addons=amd-gpu-device-plugin --driver=docker  --container-runtime=containerd --addons=ingress --addons=ingress-dns --addons=storage-provisioner-rancher: (2m37.438358129s)
--- PASS: TestAddons/Setup (157.44s)
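The Setup run above passes fifteen separate `--addons` flags to a single `minikube start`. A minimal shell sketch of how that flag list can be assembled programmatically (addon names and profile name are copied from this log; the command is printed, not executed, so no cluster is required):

```shell
# Assemble the --addons flag list used by TestAddons/Setup above.
addons=(registry registry-creds metrics-server volumesnapshots csi-hostpath-driver
        gcp-auth cloud-spanner inspektor-gadget nvidia-device-plugin yakd volcano
        amd-gpu-device-plugin ingress ingress-dns storage-provisioner-rancher)
flags=""
for a in "${addons[@]}"; do
  flags="$flags --addons=$a"
done
# Print the reconstructed command rather than running it.
echo "minikube start -p addons-674149 --wait=true --memory=4096 --driver=docker --container-runtime=containerd$flags"
```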

TestAddons/serial/Volcano (41.86s)

=== RUN   TestAddons/serial/Volcano
addons_test.go:884: volcano-controller stabilized in 56.344811ms
addons_test.go:876: volcano-admission stabilized in 56.63892ms
addons_test.go:868: volcano-scheduler stabilized in 57.130981ms
addons_test.go:890: (dbg) TestAddons/serial/Volcano: waiting 6m0s for pods matching "app=volcano-scheduler" in namespace "volcano-system" ...
helpers_test.go:352: "volcano-scheduler-76c996c8bf-msf7l" [8f848491-15d4-47a7-bedf-7769e5a3a3c8] Running
addons_test.go:890: (dbg) TestAddons/serial/Volcano: app=volcano-scheduler healthy within 6.003466207s
addons_test.go:894: (dbg) TestAddons/serial/Volcano: waiting 6m0s for pods matching "app=volcano-admission" in namespace "volcano-system" ...
helpers_test.go:352: "volcano-admission-6c447bd768-np85w" [769c21f2-f747-4285-b274-1d2dc2bc1cc9] Running
addons_test.go:894: (dbg) TestAddons/serial/Volcano: app=volcano-admission healthy within 5.003687319s
addons_test.go:898: (dbg) TestAddons/serial/Volcano: waiting 6m0s for pods matching "app=volcano-controller" in namespace "volcano-system" ...
helpers_test.go:352: "volcano-controllers-6fd4f85cb8-bfsdz" [d90485e3-a07f-45e3-b758-e7180c5bb745] Running
addons_test.go:898: (dbg) TestAddons/serial/Volcano: app=volcano-controller healthy within 5.003492136s
addons_test.go:903: (dbg) Run:  kubectl --context addons-674149 delete -n volcano-system job volcano-admission-init
addons_test.go:909: (dbg) Run:  kubectl --context addons-674149 create -f testdata/vcjob.yaml
addons_test.go:917: (dbg) Run:  kubectl --context addons-674149 get vcjob -n my-volcano
addons_test.go:935: (dbg) TestAddons/serial/Volcano: waiting 3m0s for pods matching "volcano.sh/job-name=test-job" in namespace "my-volcano" ...
helpers_test.go:352: "test-job-nginx-0" [453dfe8f-ebe5-4e64-a108-1f8ad7bc141b] Pending
helpers_test.go:352: "test-job-nginx-0" [453dfe8f-ebe5-4e64-a108-1f8ad7bc141b] Pending / Ready:ContainersNotReady (containers with unready status: [nginx]) / ContainersReady:ContainersNotReady (containers with unready status: [nginx])
helpers_test.go:352: "test-job-nginx-0" [453dfe8f-ebe5-4e64-a108-1f8ad7bc141b] Running
addons_test.go:935: (dbg) TestAddons/serial/Volcano: volcano.sh/job-name=test-job healthy within 13.003665s
addons_test.go:1053: (dbg) Run:  out/minikube-linux-arm64 -p addons-674149 addons disable volcano --alsologtostderr -v=1
addons_test.go:1053: (dbg) Done: out/minikube-linux-arm64 -p addons-674149 addons disable volcano --alsologtostderr -v=1: (12.076770931s)
--- PASS: TestAddons/serial/Volcano (41.86s)

TestAddons/serial/GCPAuth/Namespaces (0.18s)

=== RUN   TestAddons/serial/GCPAuth/Namespaces
addons_test.go:630: (dbg) Run:  kubectl --context addons-674149 create ns new-namespace
addons_test.go:644: (dbg) Run:  kubectl --context addons-674149 get secret gcp-auth -n new-namespace
--- PASS: TestAddons/serial/GCPAuth/Namespaces (0.18s)

TestAddons/serial/GCPAuth/FakeCredentials (8.83s)

=== RUN   TestAddons/serial/GCPAuth/FakeCredentials
addons_test.go:675: (dbg) Run:  kubectl --context addons-674149 create -f testdata/busybox.yaml
addons_test.go:682: (dbg) Run:  kubectl --context addons-674149 create sa gcp-auth-test
addons_test.go:688: (dbg) TestAddons/serial/GCPAuth/FakeCredentials: waiting 8m0s for pods matching "integration-test=busybox" in namespace "default" ...
helpers_test.go:352: "busybox" [3915895b-8f3d-4386-a43c-539104d2a4a6] Pending / Ready:ContainersNotReady (containers with unready status: [busybox]) / ContainersReady:ContainersNotReady (containers with unready status: [busybox])
helpers_test.go:352: "busybox" [3915895b-8f3d-4386-a43c-539104d2a4a6] Running
addons_test.go:688: (dbg) TestAddons/serial/GCPAuth/FakeCredentials: integration-test=busybox healthy within 8.004220124s
addons_test.go:694: (dbg) Run:  kubectl --context addons-674149 exec busybox -- /bin/sh -c "printenv GOOGLE_APPLICATION_CREDENTIALS"
addons_test.go:706: (dbg) Run:  kubectl --context addons-674149 describe sa gcp-auth-test
addons_test.go:720: (dbg) Run:  kubectl --context addons-674149 exec busybox -- /bin/sh -c "cat /google-app-creds.json"
addons_test.go:744: (dbg) Run:  kubectl --context addons-674149 exec busybox -- /bin/sh -c "printenv GOOGLE_CLOUD_PROJECT"
--- PASS: TestAddons/serial/GCPAuth/FakeCredentials (8.83s)

TestAddons/parallel/Registry (16.22s)

=== RUN   TestAddons/parallel/Registry
=== PAUSE TestAddons/parallel/Registry

=== CONT  TestAddons/parallel/Registry
addons_test.go:382: registry stabilized in 15.500441ms
addons_test.go:384: (dbg) TestAddons/parallel/Registry: waiting 6m0s for pods matching "actual-registry=true" in namespace "kube-system" ...
helpers_test.go:352: "registry-6b586f9694-dlvbb" [bc6d1721-0552-4008-a2c9-c95ed3408c81] Running
addons_test.go:384: (dbg) TestAddons/parallel/Registry: actual-registry=true healthy within 5.004730374s
addons_test.go:387: (dbg) TestAddons/parallel/Registry: waiting 10m0s for pods matching "registry-proxy=true" in namespace "kube-system" ...
helpers_test.go:352: "registry-proxy-9kgmn" [f317f307-5762-4134-a0d0-d3ccd22e9b5d] Running
addons_test.go:387: (dbg) TestAddons/parallel/Registry: registry-proxy=true healthy within 6.003206612s
addons_test.go:392: (dbg) Run:  kubectl --context addons-674149 delete po -l run=registry-test --now
addons_test.go:397: (dbg) Run:  kubectl --context addons-674149 run --rm registry-test --restart=Never --image=gcr.io/k8s-minikube/busybox -it -- sh -c "wget --spider -S http://registry.kube-system.svc.cluster.local"
addons_test.go:397: (dbg) Done: kubectl --context addons-674149 run --rm registry-test --restart=Never --image=gcr.io/k8s-minikube/busybox -it -- sh -c "wget --spider -S http://registry.kube-system.svc.cluster.local": (4.090276333s)
addons_test.go:411: (dbg) Run:  out/minikube-linux-arm64 -p addons-674149 ip
2025/11/24 08:47:19 [DEBUG] GET http://192.168.49.2:5000
addons_test.go:1053: (dbg) Run:  out/minikube-linux-arm64 -p addons-674149 addons disable registry --alsologtostderr -v=1
--- PASS: TestAddons/parallel/Registry (16.22s)
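The key probe in the Registry test above is a throwaway busybox pod that resolves the registry Service by its cluster DNS name. A hedged sketch of that command, printed for reference rather than executed so it is safe to run without a cluster (context name taken from this log):

```shell
# In-cluster registry probe from TestAddons/parallel/Registry.
ctx="addons-674149"   # profile/context name from this log
probe="kubectl --context $ctx run --rm registry-test --restart=Never \
  --image=gcr.io/k8s-minikube/busybox -it -- \
  sh -c 'wget --spider -S http://registry.kube-system.svc.cluster.local'"
# --spider checks reachability without downloading; -S prints server headers.
printf '%s\n' "$probe"
```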

TestAddons/parallel/RegistryCreds (0.76s)

=== RUN   TestAddons/parallel/RegistryCreds
=== PAUSE TestAddons/parallel/RegistryCreds

=== CONT  TestAddons/parallel/RegistryCreds
addons_test.go:323: registry-creds stabilized in 3.593824ms
addons_test.go:325: (dbg) Run:  out/minikube-linux-arm64 addons configure registry-creds -f ./testdata/addons_testconfig.json -p addons-674149
addons_test.go:332: (dbg) Run:  kubectl --context addons-674149 -n kube-system get secret -o yaml
addons_test.go:1053: (dbg) Run:  out/minikube-linux-arm64 -p addons-674149 addons disable registry-creds --alsologtostderr -v=1
--- PASS: TestAddons/parallel/RegistryCreds (0.76s)

TestAddons/parallel/Ingress (20.54s)

=== RUN   TestAddons/parallel/Ingress
=== PAUSE TestAddons/parallel/Ingress

=== CONT  TestAddons/parallel/Ingress
addons_test.go:209: (dbg) Run:  kubectl --context addons-674149 wait --for=condition=ready --namespace=ingress-nginx pod --selector=app.kubernetes.io/component=controller --timeout=90s
addons_test.go:234: (dbg) Run:  kubectl --context addons-674149 replace --force -f testdata/nginx-ingress-v1.yaml
addons_test.go:247: (dbg) Run:  kubectl --context addons-674149 replace --force -f testdata/nginx-pod-svc.yaml
addons_test.go:252: (dbg) TestAddons/parallel/Ingress: waiting 8m0s for pods matching "run=nginx" in namespace "default" ...
helpers_test.go:352: "nginx" [9fe4b623-3d80-4ab8-b8d0-8de70e6b5bf0] Pending / Ready:ContainersNotReady (containers with unready status: [nginx]) / ContainersReady:ContainersNotReady (containers with unready status: [nginx])
helpers_test.go:352: "nginx" [9fe4b623-3d80-4ab8-b8d0-8de70e6b5bf0] Running
addons_test.go:252: (dbg) TestAddons/parallel/Ingress: run=nginx healthy within 8.004155536s
I1124 08:47:47.390817 1654467 kapi.go:150] Service nginx in namespace default found.
addons_test.go:264: (dbg) Run:  out/minikube-linux-arm64 -p addons-674149 ssh "curl -s http://127.0.0.1/ -H 'Host: nginx.example.com'"
addons_test.go:288: (dbg) Run:  kubectl --context addons-674149 replace --force -f testdata/ingress-dns-example-v1.yaml
addons_test.go:293: (dbg) Run:  out/minikube-linux-arm64 -p addons-674149 ip
addons_test.go:299: (dbg) Run:  nslookup hello-john.test 192.168.49.2
addons_test.go:1053: (dbg) Run:  out/minikube-linux-arm64 -p addons-674149 addons disable ingress-dns --alsologtostderr -v=1
addons_test.go:1053: (dbg) Done: out/minikube-linux-arm64 -p addons-674149 addons disable ingress-dns --alsologtostderr -v=1: (2.266676143s)
addons_test.go:1053: (dbg) Run:  out/minikube-linux-arm64 -p addons-674149 addons disable ingress --alsologtostderr -v=1
addons_test.go:1053: (dbg) Done: out/minikube-linux-arm64 -p addons-674149 addons disable ingress --alsologtostderr -v=1: (8.036342907s)
--- PASS: TestAddons/parallel/Ingress (20.54s)
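The Ingress check above hinges on the Host header: curl targets the node IP directly, and the `Host: nginx.example.com` header selects the matching Ingress rule. A guarded sketch of the same probe, assuming the profile name from this log; it skips cleanly when minikube or the cluster is unavailable:

```shell
# Host-header based ingress probe, mirroring TestAddons/parallel/Ingress.
status="skipped"
if command -v minikube >/dev/null 2>&1 && ip=$(minikube -p addons-674149 ip 2>/dev/null); then
  # Without the Host header the request would hit the default backend instead.
  curl -s "http://$ip/" -H 'Host: nginx.example.com' >/dev/null && status="ok $ip"
fi
echo "ingress probe: $status"
```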

TestAddons/parallel/InspektorGadget (11.07s)

=== RUN   TestAddons/parallel/InspektorGadget
=== PAUSE TestAddons/parallel/InspektorGadget

=== CONT  TestAddons/parallel/InspektorGadget
addons_test.go:823: (dbg) TestAddons/parallel/InspektorGadget: waiting 8m0s for pods matching "k8s-app=gadget" in namespace "gadget" ...
helpers_test.go:352: "gadget-md8j9" [0f4a625e-b73b-4d1d-990d-9f2f5132724e] Running
addons_test.go:823: (dbg) TestAddons/parallel/InspektorGadget: k8s-app=gadget healthy within 5.00853369s
addons_test.go:1053: (dbg) Run:  out/minikube-linux-arm64 -p addons-674149 addons disable inspektor-gadget --alsologtostderr -v=1
addons_test.go:1053: (dbg) Done: out/minikube-linux-arm64 -p addons-674149 addons disable inspektor-gadget --alsologtostderr -v=1: (6.057147163s)
--- PASS: TestAddons/parallel/InspektorGadget (11.07s)

TestAddons/parallel/MetricsServer (6.81s)

=== RUN   TestAddons/parallel/MetricsServer
=== PAUSE TestAddons/parallel/MetricsServer

=== CONT  TestAddons/parallel/MetricsServer
addons_test.go:455: metrics-server stabilized in 4.901489ms
addons_test.go:457: (dbg) TestAddons/parallel/MetricsServer: waiting 6m0s for pods matching "k8s-app=metrics-server" in namespace "kube-system" ...
helpers_test.go:352: "metrics-server-85b7d694d7-b9zbj" [a7c3b53c-5165-4497-928f-eb4e87ef1d5d] Running
addons_test.go:457: (dbg) TestAddons/parallel/MetricsServer: k8s-app=metrics-server healthy within 6.002703305s
addons_test.go:463: (dbg) Run:  kubectl --context addons-674149 top pods -n kube-system
addons_test.go:1053: (dbg) Run:  out/minikube-linux-arm64 -p addons-674149 addons disable metrics-server --alsologtostderr -v=1
--- PASS: TestAddons/parallel/MetricsServer (6.81s)

TestAddons/parallel/CSI (36.08s)

=== RUN   TestAddons/parallel/CSI
=== PAUSE TestAddons/parallel/CSI

=== CONT  TestAddons/parallel/CSI
I1124 08:47:19.881765 1654467 kapi.go:75] Waiting for pod with label "kubernetes.io/minikube-addons=csi-hostpath-driver" in ns "kube-system" ...
I1124 08:47:19.885339 1654467 kapi.go:86] Found 3 Pods for label selector kubernetes.io/minikube-addons=csi-hostpath-driver
I1124 08:47:19.885364 1654467 kapi.go:107] duration metric: took 6.335865ms to wait for kubernetes.io/minikube-addons=csi-hostpath-driver ...
addons_test.go:549: csi-hostpath-driver pods stabilized in 6.345768ms
addons_test.go:552: (dbg) Run:  kubectl --context addons-674149 create -f testdata/csi-hostpath-driver/pvc.yaml
addons_test.go:557: (dbg) TestAddons/parallel/CSI: waiting 6m0s for pvc "hpvc" in namespace "default" ...
helpers_test.go:402: (dbg) Run:  kubectl --context addons-674149 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:402: (dbg) Run:  kubectl --context addons-674149 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:402: (dbg) Run:  kubectl --context addons-674149 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:402: (dbg) Run:  kubectl --context addons-674149 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:402: (dbg) Run:  kubectl --context addons-674149 get pvc hpvc -o jsonpath={.status.phase} -n default
addons_test.go:562: (dbg) Run:  kubectl --context addons-674149 create -f testdata/csi-hostpath-driver/pv-pod.yaml
addons_test.go:567: (dbg) TestAddons/parallel/CSI: waiting 6m0s for pods matching "app=task-pv-pod" in namespace "default" ...
helpers_test.go:352: "task-pv-pod" [f6534211-b03e-40b9-8f1c-0de672019729] Pending
helpers_test.go:352: "task-pv-pod" [f6534211-b03e-40b9-8f1c-0de672019729] Pending / Ready:ContainersNotReady (containers with unready status: [task-pv-container]) / ContainersReady:ContainersNotReady (containers with unready status: [task-pv-container])
helpers_test.go:352: "task-pv-pod" [f6534211-b03e-40b9-8f1c-0de672019729] Running
addons_test.go:567: (dbg) TestAddons/parallel/CSI: app=task-pv-pod healthy within 7.003949035s
addons_test.go:572: (dbg) Run:  kubectl --context addons-674149 create -f testdata/csi-hostpath-driver/snapshot.yaml
addons_test.go:577: (dbg) TestAddons/parallel/CSI: waiting 6m0s for volume snapshot "new-snapshot-demo" in namespace "default" ...
helpers_test.go:427: (dbg) Run:  kubectl --context addons-674149 get volumesnapshot new-snapshot-demo -o jsonpath={.status.readyToUse} -n default
helpers_test.go:435: TestAddons/parallel/CSI: WARNING: volume snapshot get for "default" "new-snapshot-demo" returned: 
helpers_test.go:427: (dbg) Run:  kubectl --context addons-674149 get volumesnapshot new-snapshot-demo -o jsonpath={.status.readyToUse} -n default
addons_test.go:582: (dbg) Run:  kubectl --context addons-674149 delete pod task-pv-pod
addons_test.go:582: (dbg) Done: kubectl --context addons-674149 delete pod task-pv-pod: (1.286225352s)
addons_test.go:588: (dbg) Run:  kubectl --context addons-674149 delete pvc hpvc
addons_test.go:594: (dbg) Run:  kubectl --context addons-674149 create -f testdata/csi-hostpath-driver/pvc-restore.yaml
addons_test.go:599: (dbg) TestAddons/parallel/CSI: waiting 6m0s for pvc "hpvc-restore" in namespace "default" ...
helpers_test.go:402: (dbg) Run:  kubectl --context addons-674149 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:402: (dbg) Run:  kubectl --context addons-674149 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:402: (dbg) Run:  kubectl --context addons-674149 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:402: (dbg) Run:  kubectl --context addons-674149 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:402: (dbg) Run:  kubectl --context addons-674149 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
addons_test.go:604: (dbg) Run:  kubectl --context addons-674149 create -f testdata/csi-hostpath-driver/pv-pod-restore.yaml
addons_test.go:609: (dbg) TestAddons/parallel/CSI: waiting 6m0s for pods matching "app=task-pv-pod-restore" in namespace "default" ...
helpers_test.go:352: "task-pv-pod-restore" [4489def3-ae96-42a0-bd9c-7d7d79231b7a] Pending
helpers_test.go:352: "task-pv-pod-restore" [4489def3-ae96-42a0-bd9c-7d7d79231b7a] Pending / Ready:ContainersNotReady (containers with unready status: [task-pv-container]) / ContainersReady:ContainersNotReady (containers with unready status: [task-pv-container])
helpers_test.go:352: "task-pv-pod-restore" [4489def3-ae96-42a0-bd9c-7d7d79231b7a] Running
addons_test.go:609: (dbg) TestAddons/parallel/CSI: app=task-pv-pod-restore healthy within 7.003657445s
addons_test.go:614: (dbg) Run:  kubectl --context addons-674149 delete pod task-pv-pod-restore
addons_test.go:614: (dbg) Done: kubectl --context addons-674149 delete pod task-pv-pod-restore: (1.321279783s)
addons_test.go:618: (dbg) Run:  kubectl --context addons-674149 delete pvc hpvc-restore
addons_test.go:622: (dbg) Run:  kubectl --context addons-674149 delete volumesnapshot new-snapshot-demo
addons_test.go:1053: (dbg) Run:  out/minikube-linux-arm64 -p addons-674149 addons disable volumesnapshots --alsologtostderr -v=1
addons_test.go:1053: (dbg) Done: out/minikube-linux-arm64 -p addons-674149 addons disable volumesnapshots --alsologtostderr -v=1: (1.257140512s)
addons_test.go:1053: (dbg) Run:  out/minikube-linux-arm64 -p addons-674149 addons disable csi-hostpath-driver --alsologtostderr -v=1
addons_test.go:1053: (dbg) Done: out/minikube-linux-arm64 -p addons-674149 addons disable csi-hostpath-driver --alsologtostderr -v=1: (7.399995568s)
--- PASS: TestAddons/parallel/CSI (36.08s)
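Condensed, the sequence the CSI test walks through is: claim, pod, snapshot, delete the originals, restore-claim, pod. A hedged sketch listing those kubectl steps (manifest file names follow minikube's testdata layout; the steps are printed, not executed, so no cluster is needed):

```shell
# The CSI snapshot/restore workflow from TestAddons/parallel/CSI, as plain kubectl steps.
ctx="addons-674149"   # profile/context name from this log
steps=(
  "kubectl --context $ctx create -f pvc.yaml             # PVC bound by csi-hostpath-driver"
  "kubectl --context $ctx create -f pv-pod.yaml          # pod writing to the PVC"
  "kubectl --context $ctx create -f snapshot.yaml        # VolumeSnapshot of the bound PVC"
  "kubectl --context $ctx delete pod task-pv-pod"
  "kubectl --context $ctx delete pvc hpvc"
  "kubectl --context $ctx create -f pvc-restore.yaml     # new PVC sourced from the snapshot"
  "kubectl --context $ctx create -f pv-pod-restore.yaml  # pod reading the restored data"
)
printf '%s\n' "${steps[@]}"
```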

TestAddons/parallel/Headlamp (17.11s)

=== RUN   TestAddons/parallel/Headlamp
=== PAUSE TestAddons/parallel/Headlamp

=== CONT  TestAddons/parallel/Headlamp
addons_test.go:808: (dbg) Run:  out/minikube-linux-arm64 addons enable headlamp -p addons-674149 --alsologtostderr -v=1
addons_test.go:808: (dbg) Done: out/minikube-linux-arm64 addons enable headlamp -p addons-674149 --alsologtostderr -v=1: (1.090767297s)
addons_test.go:813: (dbg) TestAddons/parallel/Headlamp: waiting 8m0s for pods matching "app.kubernetes.io/name=headlamp" in namespace "headlamp" ...
helpers_test.go:352: "headlamp-dfcdc64b-p5lbz" [c6d93cee-1e6c-4b2b-9622-d5aa59770ea4] Pending / Ready:ContainersNotReady (containers with unready status: [headlamp]) / ContainersReady:ContainersNotReady (containers with unready status: [headlamp])
helpers_test.go:352: "headlamp-dfcdc64b-p5lbz" [c6d93cee-1e6c-4b2b-9622-d5aa59770ea4] Running
addons_test.go:813: (dbg) TestAddons/parallel/Headlamp: app.kubernetes.io/name=headlamp healthy within 10.003827991s
addons_test.go:1053: (dbg) Run:  out/minikube-linux-arm64 -p addons-674149 addons disable headlamp --alsologtostderr -v=1
addons_test.go:1053: (dbg) Done: out/minikube-linux-arm64 -p addons-674149 addons disable headlamp --alsologtostderr -v=1: (6.013026046s)
--- PASS: TestAddons/parallel/Headlamp (17.11s)

TestAddons/parallel/CloudSpanner (5.66s)

=== RUN   TestAddons/parallel/CloudSpanner
=== PAUSE TestAddons/parallel/CloudSpanner

=== CONT  TestAddons/parallel/CloudSpanner
addons_test.go:840: (dbg) TestAddons/parallel/CloudSpanner: waiting 6m0s for pods matching "app=cloud-spanner-emulator" in namespace "default" ...
helpers_test.go:352: "cloud-spanner-emulator-5bdddb765-h7zgt" [f557c851-71df-4d14-b7c3-eed6a1351af3] Running
addons_test.go:840: (dbg) TestAddons/parallel/CloudSpanner: app=cloud-spanner-emulator healthy within 5.003943096s
addons_test.go:1053: (dbg) Run:  out/minikube-linux-arm64 -p addons-674149 addons disable cloud-spanner --alsologtostderr -v=1
--- PASS: TestAddons/parallel/CloudSpanner (5.66s)

TestAddons/parallel/NvidiaDevicePlugin (5.55s)

=== RUN   TestAddons/parallel/NvidiaDevicePlugin
=== PAUSE TestAddons/parallel/NvidiaDevicePlugin

=== CONT  TestAddons/parallel/NvidiaDevicePlugin
addons_test.go:1025: (dbg) TestAddons/parallel/NvidiaDevicePlugin: waiting 6m0s for pods matching "name=nvidia-device-plugin-ds" in namespace "kube-system" ...
helpers_test.go:352: "nvidia-device-plugin-daemonset-7dms5" [66c6813c-89d6-4546-b6ab-601ab40673c5] Running
addons_test.go:1025: (dbg) TestAddons/parallel/NvidiaDevicePlugin: name=nvidia-device-plugin-ds healthy within 5.004510898s
addons_test.go:1053: (dbg) Run:  out/minikube-linux-arm64 -p addons-674149 addons disable nvidia-device-plugin --alsologtostderr -v=1
--- PASS: TestAddons/parallel/NvidiaDevicePlugin (5.55s)

TestAddons/parallel/Yakd (10.87s)

=== RUN   TestAddons/parallel/Yakd
=== PAUSE TestAddons/parallel/Yakd

=== CONT  TestAddons/parallel/Yakd
addons_test.go:1047: (dbg) TestAddons/parallel/Yakd: waiting 2m0s for pods matching "app.kubernetes.io/name=yakd-dashboard" in namespace "yakd-dashboard" ...
helpers_test.go:352: "yakd-dashboard-5ff678cb9-qf4sn" [88b6fdde-23fe-4465-9b45-25e51309e33c] Running
addons_test.go:1047: (dbg) TestAddons/parallel/Yakd: app.kubernetes.io/name=yakd-dashboard healthy within 5.003586253s
addons_test.go:1053: (dbg) Run:  out/minikube-linux-arm64 -p addons-674149 addons disable yakd --alsologtostderr -v=1
addons_test.go:1053: (dbg) Done: out/minikube-linux-arm64 -p addons-674149 addons disable yakd --alsologtostderr -v=1: (5.869091483s)
--- PASS: TestAddons/parallel/Yakd (10.87s)

TestAddons/StoppedEnableDisable (12.63s)

=== RUN   TestAddons/StoppedEnableDisable
addons_test.go:172: (dbg) Run:  out/minikube-linux-arm64 stop -p addons-674149
addons_test.go:172: (dbg) Done: out/minikube-linux-arm64 stop -p addons-674149: (12.277813222s)
addons_test.go:176: (dbg) Run:  out/minikube-linux-arm64 addons enable dashboard -p addons-674149
addons_test.go:180: (dbg) Run:  out/minikube-linux-arm64 addons disable dashboard -p addons-674149
addons_test.go:185: (dbg) Run:  out/minikube-linux-arm64 addons disable gvisor -p addons-674149
--- PASS: TestAddons/StoppedEnableDisable (12.63s)

TestCertOptions (38.37s)

=== RUN   TestCertOptions
=== PAUSE TestCertOptions

=== CONT  TestCertOptions
cert_options_test.go:49: (dbg) Run:  out/minikube-linux-arm64 start -p cert-options-477984 --memory=3072 --apiserver-ips=127.0.0.1 --apiserver-ips=192.168.15.15 --apiserver-names=localhost --apiserver-names=www.google.com --apiserver-port=8555 --driver=docker  --container-runtime=containerd
cert_options_test.go:49: (dbg) Done: out/minikube-linux-arm64 start -p cert-options-477984 --memory=3072 --apiserver-ips=127.0.0.1 --apiserver-ips=192.168.15.15 --apiserver-names=localhost --apiserver-names=www.google.com --apiserver-port=8555 --driver=docker  --container-runtime=containerd: (35.525350696s)
cert_options_test.go:60: (dbg) Run:  out/minikube-linux-arm64 -p cert-options-477984 ssh "openssl x509 -text -noout -in /var/lib/minikube/certs/apiserver.crt"
cert_options_test.go:88: (dbg) Run:  kubectl --context cert-options-477984 config view
cert_options_test.go:100: (dbg) Run:  out/minikube-linux-arm64 ssh -p cert-options-477984 -- "sudo cat /etc/kubernetes/admin.conf"
helpers_test.go:175: Cleaning up "cert-options-477984" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-arm64 delete -p cert-options-477984
helpers_test.go:178: (dbg) Done: out/minikube-linux-arm64 delete -p cert-options-477984: (2.068000818s)
--- PASS: TestCertOptions (38.37s)

TestCertExpiration (220.98s)

=== RUN   TestCertExpiration
=== PAUSE TestCertExpiration

=== CONT  TestCertExpiration
cert_options_test.go:123: (dbg) Run:  out/minikube-linux-arm64 start -p cert-expiration-991536 --memory=3072 --cert-expiration=3m --driver=docker  --container-runtime=containerd
E1124 10:16:17.586634 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1124 10:16:24.716923 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-941011/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
cert_options_test.go:123: (dbg) Done: out/minikube-linux-arm64 start -p cert-expiration-991536 --memory=3072 --cert-expiration=3m --driver=docker  --container-runtime=containerd: (30.863955687s)
E1124 10:17:47.792419 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-941011/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1124 10:18:14.515630 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1124 10:19:06.679681 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/addons-674149/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
cert_options_test.go:131: (dbg) Run:  out/minikube-linux-arm64 start -p cert-expiration-991536 --memory=3072 --cert-expiration=8760h --driver=docker  --container-runtime=containerd
cert_options_test.go:131: (dbg) Done: out/minikube-linux-arm64 start -p cert-expiration-991536 --memory=3072 --cert-expiration=8760h --driver=docker  --container-runtime=containerd: (7.72632349s)
helpers_test.go:175: Cleaning up "cert-expiration-991536" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-arm64 delete -p cert-expiration-991536
helpers_test.go:178: (dbg) Done: out/minikube-linux-arm64 delete -p cert-expiration-991536: (2.386571751s)
--- PASS: TestCertExpiration (220.98s)

TestForceSystemdFlag (40.59s)

=== RUN   TestForceSystemdFlag
=== PAUSE TestForceSystemdFlag

=== CONT  TestForceSystemdFlag
docker_test.go:91: (dbg) Run:  out/minikube-linux-arm64 start -p force-systemd-flag-380631 --memory=3072 --force-systemd --alsologtostderr -v=5 --driver=docker  --container-runtime=containerd
docker_test.go:91: (dbg) Done: out/minikube-linux-arm64 start -p force-systemd-flag-380631 --memory=3072 --force-systemd --alsologtostderr -v=5 --driver=docker  --container-runtime=containerd: (38.000889709s)
docker_test.go:121: (dbg) Run:  out/minikube-linux-arm64 -p force-systemd-flag-380631 ssh "cat /etc/containerd/config.toml"
helpers_test.go:175: Cleaning up "force-systemd-flag-380631" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-arm64 delete -p force-systemd-flag-380631
helpers_test.go:178: (dbg) Done: out/minikube-linux-arm64 delete -p force-systemd-flag-380631: (2.188353794s)
--- PASS: TestForceSystemdFlag (40.59s)

TestForceSystemdEnv (37.61s)

=== RUN   TestForceSystemdEnv
=== PAUSE TestForceSystemdEnv

=== CONT  TestForceSystemdEnv
docker_test.go:155: (dbg) Run:  out/minikube-linux-arm64 start -p force-systemd-env-924581 --memory=3072 --alsologtostderr -v=5 --driver=docker  --container-runtime=containerd
E1124 10:16:03.604285 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/addons-674149/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
docker_test.go:155: (dbg) Done: out/minikube-linux-arm64 start -p force-systemd-env-924581 --memory=3072 --alsologtostderr -v=5 --driver=docker  --container-runtime=containerd: (35.195896499s)
docker_test.go:121: (dbg) Run:  out/minikube-linux-arm64 -p force-systemd-env-924581 ssh "cat /etc/containerd/config.toml"
helpers_test.go:175: Cleaning up "force-systemd-env-924581" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-arm64 delete -p force-systemd-env-924581
helpers_test.go:178: (dbg) Done: out/minikube-linux-arm64 delete -p force-systemd-env-924581: (2.106076853s)
--- PASS: TestForceSystemdEnv (37.61s)

TestErrorSpam/setup (32.87s)

=== RUN   TestErrorSpam/setup
error_spam_test.go:81: (dbg) Run:  out/minikube-linux-arm64 start -p nospam-290465 -n=1 --memory=3072 --wait=false --log_dir=/tmp/nospam-290465 --driver=docker  --container-runtime=containerd
error_spam_test.go:81: (dbg) Done: out/minikube-linux-arm64 start -p nospam-290465 -n=1 --memory=3072 --wait=false --log_dir=/tmp/nospam-290465 --driver=docker  --container-runtime=containerd: (32.871400668s)
--- PASS: TestErrorSpam/setup (32.87s)

TestErrorSpam/start (0.83s)

=== RUN   TestErrorSpam/start
error_spam_test.go:206: Cleaning up 1 logfile(s) ...
error_spam_test.go:149: (dbg) Run:  out/minikube-linux-arm64 -p nospam-290465 --log_dir /tmp/nospam-290465 start --dry-run
error_spam_test.go:149: (dbg) Run:  out/minikube-linux-arm64 -p nospam-290465 --log_dir /tmp/nospam-290465 start --dry-run
error_spam_test.go:172: (dbg) Run:  out/minikube-linux-arm64 -p nospam-290465 --log_dir /tmp/nospam-290465 start --dry-run
--- PASS: TestErrorSpam/start (0.83s)

TestErrorSpam/status (1.08s)

=== RUN   TestErrorSpam/status
error_spam_test.go:206: Cleaning up 0 logfile(s) ...
error_spam_test.go:149: (dbg) Run:  out/minikube-linux-arm64 -p nospam-290465 --log_dir /tmp/nospam-290465 status
error_spam_test.go:149: (dbg) Run:  out/minikube-linux-arm64 -p nospam-290465 --log_dir /tmp/nospam-290465 status
error_spam_test.go:172: (dbg) Run:  out/minikube-linux-arm64 -p nospam-290465 --log_dir /tmp/nospam-290465 status
--- PASS: TestErrorSpam/status (1.08s)

TestErrorSpam/pause (1.81s)

=== RUN   TestErrorSpam/pause
error_spam_test.go:206: Cleaning up 0 logfile(s) ...
error_spam_test.go:149: (dbg) Run:  out/minikube-linux-arm64 -p nospam-290465 --log_dir /tmp/nospam-290465 pause
error_spam_test.go:149: (dbg) Run:  out/minikube-linux-arm64 -p nospam-290465 --log_dir /tmp/nospam-290465 pause
error_spam_test.go:172: (dbg) Run:  out/minikube-linux-arm64 -p nospam-290465 --log_dir /tmp/nospam-290465 pause
--- PASS: TestErrorSpam/pause (1.81s)

TestErrorSpam/unpause (1.89s)

=== RUN   TestErrorSpam/unpause
error_spam_test.go:206: Cleaning up 0 logfile(s) ...
error_spam_test.go:149: (dbg) Run:  out/minikube-linux-arm64 -p nospam-290465 --log_dir /tmp/nospam-290465 unpause
error_spam_test.go:149: (dbg) Run:  out/minikube-linux-arm64 -p nospam-290465 --log_dir /tmp/nospam-290465 unpause
error_spam_test.go:172: (dbg) Run:  out/minikube-linux-arm64 -p nospam-290465 --log_dir /tmp/nospam-290465 unpause
--- PASS: TestErrorSpam/unpause (1.89s)

TestErrorSpam/stop (1.64s)

=== RUN   TestErrorSpam/stop
error_spam_test.go:206: Cleaning up 0 logfile(s) ...
error_spam_test.go:149: (dbg) Run:  out/minikube-linux-arm64 -p nospam-290465 --log_dir /tmp/nospam-290465 stop
error_spam_test.go:149: (dbg) Done: out/minikube-linux-arm64 -p nospam-290465 --log_dir /tmp/nospam-290465 stop: (1.427704583s)
error_spam_test.go:149: (dbg) Run:  out/minikube-linux-arm64 -p nospam-290465 --log_dir /tmp/nospam-290465 stop
error_spam_test.go:172: (dbg) Run:  out/minikube-linux-arm64 -p nospam-290465 --log_dir /tmp/nospam-290465 stop
--- PASS: TestErrorSpam/stop (1.64s)

TestFunctional/serial/CopySyncFile (0.01s)

=== RUN   TestFunctional/serial/CopySyncFile
functional_test.go:1860: local sync path: /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/test/nested/copy/1654467/hosts
--- PASS: TestFunctional/serial/CopySyncFile (0.01s)

TestFunctional/serial/StartWithProxy (79.06s)

=== RUN   TestFunctional/serial/StartWithProxy
functional_test.go:2239: (dbg) Run:  out/minikube-linux-arm64 start -p functional-941011 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=containerd
functional_test.go:2239: (dbg) Done: out/minikube-linux-arm64 start -p functional-941011 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=containerd: (1m19.05522019s)
--- PASS: TestFunctional/serial/StartWithProxy (79.06s)

TestFunctional/serial/AuditLog (0s)

=== RUN   TestFunctional/serial/AuditLog
--- PASS: TestFunctional/serial/AuditLog (0.00s)

TestFunctional/serial/SoftStart (8.3s)

=== RUN   TestFunctional/serial/SoftStart
I1124 08:55:01.460821 1654467 config.go:182] Loaded profile config "functional-941011": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
functional_test.go:674: (dbg) Run:  out/minikube-linux-arm64 start -p functional-941011 --alsologtostderr -v=8
functional_test.go:674: (dbg) Done: out/minikube-linux-arm64 start -p functional-941011 --alsologtostderr -v=8: (8.300844072s)
functional_test.go:678: soft start took 8.302376726s for "functional-941011" cluster.
I1124 08:55:09.762074 1654467 config.go:182] Loaded profile config "functional-941011": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
--- PASS: TestFunctional/serial/SoftStart (8.30s)

TestFunctional/serial/KubeContext (0.06s)

=== RUN   TestFunctional/serial/KubeContext
functional_test.go:696: (dbg) Run:  kubectl config current-context
--- PASS: TestFunctional/serial/KubeContext (0.06s)

TestFunctional/serial/KubectlGetPods (0.11s)

=== RUN   TestFunctional/serial/KubectlGetPods
functional_test.go:711: (dbg) Run:  kubectl --context functional-941011 get po -A
--- PASS: TestFunctional/serial/KubectlGetPods (0.11s)

TestFunctional/serial/CacheCmd/cache/add_remote (3.45s)

=== RUN   TestFunctional/serial/CacheCmd/cache/add_remote
functional_test.go:1064: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 cache add registry.k8s.io/pause:3.1
functional_test.go:1064: (dbg) Done: out/minikube-linux-arm64 -p functional-941011 cache add registry.k8s.io/pause:3.1: (1.258862619s)
functional_test.go:1064: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 cache add registry.k8s.io/pause:3.3
functional_test.go:1064: (dbg) Done: out/minikube-linux-arm64 -p functional-941011 cache add registry.k8s.io/pause:3.3: (1.136721378s)
functional_test.go:1064: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 cache add registry.k8s.io/pause:latest
functional_test.go:1064: (dbg) Done: out/minikube-linux-arm64 -p functional-941011 cache add registry.k8s.io/pause:latest: (1.052198165s)
--- PASS: TestFunctional/serial/CacheCmd/cache/add_remote (3.45s)

TestFunctional/serial/CacheCmd/cache/add_local (1.3s)

=== RUN   TestFunctional/serial/CacheCmd/cache/add_local
functional_test.go:1092: (dbg) Run:  docker build -t minikube-local-cache-test:functional-941011 /tmp/TestFunctionalserialCacheCmdcacheadd_local378440659/001
functional_test.go:1104: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 cache add minikube-local-cache-test:functional-941011
functional_test.go:1109: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 cache delete minikube-local-cache-test:functional-941011
functional_test.go:1098: (dbg) Run:  docker rmi minikube-local-cache-test:functional-941011
--- PASS: TestFunctional/serial/CacheCmd/cache/add_local (1.30s)

TestFunctional/serial/CacheCmd/cache/CacheDelete (0.06s)

=== RUN   TestFunctional/serial/CacheCmd/cache/CacheDelete
functional_test.go:1117: (dbg) Run:  out/minikube-linux-arm64 cache delete registry.k8s.io/pause:3.3
--- PASS: TestFunctional/serial/CacheCmd/cache/CacheDelete (0.06s)

TestFunctional/serial/CacheCmd/cache/list (0.06s)

=== RUN   TestFunctional/serial/CacheCmd/cache/list
functional_test.go:1125: (dbg) Run:  out/minikube-linux-arm64 cache list
--- PASS: TestFunctional/serial/CacheCmd/cache/list (0.06s)

TestFunctional/serial/CacheCmd/cache/verify_cache_inside_node (0.34s)

=== RUN   TestFunctional/serial/CacheCmd/cache/verify_cache_inside_node
functional_test.go:1139: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 ssh sudo crictl images
--- PASS: TestFunctional/serial/CacheCmd/cache/verify_cache_inside_node (0.34s)

TestFunctional/serial/CacheCmd/cache/cache_reload (1.87s)

=== RUN   TestFunctional/serial/CacheCmd/cache/cache_reload
functional_test.go:1162: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 ssh sudo crictl rmi registry.k8s.io/pause:latest
functional_test.go:1168: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 ssh sudo crictl inspecti registry.k8s.io/pause:latest
functional_test.go:1168: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-941011 ssh sudo crictl inspecti registry.k8s.io/pause:latest: exit status 1 (298.158212ms)

-- stdout --
	FATA[0000] no such image "registry.k8s.io/pause:latest" present 

-- /stdout --
** stderr ** 
	ssh: Process exited with status 1

** /stderr **
functional_test.go:1173: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 cache reload
functional_test.go:1178: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 ssh sudo crictl inspecti registry.k8s.io/pause:latest
--- PASS: TestFunctional/serial/CacheCmd/cache/cache_reload (1.87s)

TestFunctional/serial/CacheCmd/cache/delete (0.13s)

=== RUN   TestFunctional/serial/CacheCmd/cache/delete
functional_test.go:1187: (dbg) Run:  out/minikube-linux-arm64 cache delete registry.k8s.io/pause:3.1
functional_test.go:1187: (dbg) Run:  out/minikube-linux-arm64 cache delete registry.k8s.io/pause:latest
--- PASS: TestFunctional/serial/CacheCmd/cache/delete (0.13s)

TestFunctional/serial/MinikubeKubectlCmd (0.14s)

=== RUN   TestFunctional/serial/MinikubeKubectlCmd
functional_test.go:731: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 kubectl -- --context functional-941011 get pods
--- PASS: TestFunctional/serial/MinikubeKubectlCmd (0.14s)

TestFunctional/serial/MinikubeKubectlCmdDirectly (0.14s)

=== RUN   TestFunctional/serial/MinikubeKubectlCmdDirectly
functional_test.go:756: (dbg) Run:  out/kubectl --context functional-941011 get pods
--- PASS: TestFunctional/serial/MinikubeKubectlCmdDirectly (0.14s)

TestFunctional/serial/ExtraConfig (54.57s)

=== RUN   TestFunctional/serial/ExtraConfig
functional_test.go:772: (dbg) Run:  out/minikube-linux-arm64 start -p functional-941011 --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision --wait=all
E1124 08:56:03.607918 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/addons-674149/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1124 08:56:03.614342 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/addons-674149/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1124 08:56:03.625826 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/addons-674149/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1124 08:56:03.647232 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/addons-674149/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1124 08:56:03.688672 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/addons-674149/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1124 08:56:03.770103 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/addons-674149/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1124 08:56:03.931697 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/addons-674149/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1124 08:56:04.253434 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/addons-674149/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1124 08:56:04.895642 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/addons-674149/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1124 08:56:06.177390 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/addons-674149/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1124 08:56:08.740333 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/addons-674149/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
functional_test.go:772: (dbg) Done: out/minikube-linux-arm64 start -p functional-941011 --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision --wait=all: (54.565945038s)
functional_test.go:776: restart took 54.566050942s for "functional-941011" cluster.
I1124 08:56:11.989206 1654467 config.go:182] Loaded profile config "functional-941011": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
--- PASS: TestFunctional/serial/ExtraConfig (54.57s)

TestFunctional/serial/ComponentHealth (0.11s)

=== RUN   TestFunctional/serial/ComponentHealth
functional_test.go:825: (dbg) Run:  kubectl --context functional-941011 get po -l tier=control-plane -n kube-system -o=json
functional_test.go:840: etcd phase: Running
functional_test.go:850: etcd status: Ready
functional_test.go:840: kube-apiserver phase: Running
functional_test.go:850: kube-apiserver status: Ready
functional_test.go:840: kube-controller-manager phase: Running
functional_test.go:850: kube-controller-manager status: Ready
functional_test.go:840: kube-scheduler phase: Running
functional_test.go:850: kube-scheduler status: Ready
--- PASS: TestFunctional/serial/ComponentHealth (0.11s)

TestFunctional/serial/LogsCmd (1.49s)

=== RUN   TestFunctional/serial/LogsCmd
functional_test.go:1251: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 logs
functional_test.go:1251: (dbg) Done: out/minikube-linux-arm64 -p functional-941011 logs: (1.490248267s)
--- PASS: TestFunctional/serial/LogsCmd (1.49s)

TestFunctional/serial/LogsFileCmd (1.53s)

=== RUN   TestFunctional/serial/LogsFileCmd
functional_test.go:1265: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 logs --file /tmp/TestFunctionalserialLogsFileCmd447799549/001/logs.txt
E1124 08:56:13.861709 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/addons-674149/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
functional_test.go:1265: (dbg) Done: out/minikube-linux-arm64 -p functional-941011 logs --file /tmp/TestFunctionalserialLogsFileCmd447799549/001/logs.txt: (1.529334877s)
--- PASS: TestFunctional/serial/LogsFileCmd (1.53s)

TestFunctional/serial/InvalidService (4.5s)

=== RUN   TestFunctional/serial/InvalidService
functional_test.go:2326: (dbg) Run:  kubectl --context functional-941011 apply -f testdata/invalidsvc.yaml
functional_test.go:2340: (dbg) Run:  out/minikube-linux-arm64 service invalid-svc -p functional-941011
functional_test.go:2340: (dbg) Non-zero exit: out/minikube-linux-arm64 service invalid-svc -p functional-941011: exit status 115 (412.32508ms)

-- stdout --
	┌───────────┬─────────────┬─────────────┬───────────────────────────┐
	│ NAMESPACE │    NAME     │ TARGET PORT │            URL            │
	├───────────┼─────────────┼─────────────┼───────────────────────────┤
	│ default   │ invalid-svc │ 80          │ http://192.168.49.2:30276 │
	└───────────┴─────────────┴─────────────┴───────────────────────────┘
	
	

-- /stdout --
** stderr ** 
	X Exiting due to SVC_UNREACHABLE: service not available: no running pod for service invalid-svc found
	* 
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│    * Please also attach the following file to the GitHub issue:                             │
	│    * - /tmp/minikube_service_96b204199e3191fa1740d4430b018a3c8028d52d_0.log                 │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯

** /stderr **
functional_test.go:2332: (dbg) Run:  kubectl --context functional-941011 delete -f testdata/invalidsvc.yaml
--- PASS: TestFunctional/serial/InvalidService (4.50s)

TestFunctional/parallel/ConfigCmd (0.51s)

=== RUN   TestFunctional/parallel/ConfigCmd
=== PAUSE TestFunctional/parallel/ConfigCmd

=== CONT  TestFunctional/parallel/ConfigCmd
functional_test.go:1214: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 config unset cpus
functional_test.go:1214: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 config get cpus
functional_test.go:1214: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-941011 config get cpus: exit status 14 (95.804909ms)

** stderr ** 
	Error: specified key could not be found in config

** /stderr **
functional_test.go:1214: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 config set cpus 2
functional_test.go:1214: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 config get cpus
functional_test.go:1214: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 config unset cpus
functional_test.go:1214: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 config get cpus
functional_test.go:1214: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-941011 config get cpus: exit status 14 (84.632483ms)

** stderr ** 
	Error: specified key could not be found in config

** /stderr **
--- PASS: TestFunctional/parallel/ConfigCmd (0.51s)

TestFunctional/parallel/DryRun (0.47s)

=== RUN   TestFunctional/parallel/DryRun
=== PAUSE TestFunctional/parallel/DryRun

=== CONT  TestFunctional/parallel/DryRun
functional_test.go:989: (dbg) Run:  out/minikube-linux-arm64 start -p functional-941011 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=containerd
functional_test.go:989: (dbg) Non-zero exit: out/minikube-linux-arm64 start -p functional-941011 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=containerd: exit status 23 (206.043626ms)

-- stdout --
	* [functional-941011] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	  - MINIKUBE_LOCATION=21978
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/21978-1652607/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/21978-1652607/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-arm64
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the docker driver based on existing profile
	
	

-- /stdout --
** stderr ** 
	I1124 09:02:10.489319 1692325 out.go:360] Setting OutFile to fd 1 ...
	I1124 09:02:10.489435 1692325 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1124 09:02:10.489441 1692325 out.go:374] Setting ErrFile to fd 2...
	I1124 09:02:10.489445 1692325 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1124 09:02:10.491595 1692325 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21978-1652607/.minikube/bin
	I1124 09:02:10.492043 1692325 out.go:368] Setting JSON to false
	I1124 09:02:10.493058 1692325 start.go:133] hostinfo: {"hostname":"ip-172-31-30-239","uptime":27860,"bootTime":1763947071,"procs":190,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"92f46a7d-c249-4c12-924a-77f64874c910"}
	I1124 09:02:10.493133 1692325 start.go:143] virtualization:  
	I1124 09:02:10.498094 1692325 out.go:179] * [functional-941011] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1124 09:02:10.502016 1692325 out.go:179]   - MINIKUBE_LOCATION=21978
	I1124 09:02:10.502094 1692325 notify.go:221] Checking for updates...
	I1124 09:02:10.507859 1692325 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1124 09:02:10.510777 1692325 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/21978-1652607/kubeconfig
	I1124 09:02:10.513728 1692325 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/21978-1652607/.minikube
	I1124 09:02:10.516565 1692325 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1124 09:02:10.519424 1692325 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1124 09:02:10.522802 1692325 config.go:182] Loaded profile config "functional-941011": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
	I1124 09:02:10.523440 1692325 driver.go:422] Setting default libvirt URI to qemu:///system
	I1124 09:02:10.553022 1692325 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1124 09:02:10.553183 1692325 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1124 09:02:10.620207 1692325 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-11-24 09:02:10.610388346 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:a
arch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214831104 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx P
ath:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1124 09:02:10.620305 1692325 docker.go:319] overlay module found
	I1124 09:02:10.623441 1692325 out.go:179] * Using the docker driver based on existing profile
	I1124 09:02:10.626259 1692325 start.go:309] selected driver: docker
	I1124 09:02:10.626286 1692325 start.go:927] validating driver "docker" against &{Name:functional-941011 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.2 ClusterName:functional-941011 Namespace:default APIServerHAVIP: APIServerNa
me:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.34.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOpt
ions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1124 09:02:10.626392 1692325 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1124 09:02:10.629963 1692325 out.go:203] 
	W1124 09:02:10.632887 1692325 out.go:285] X Exiting due to RSRC_INSUFFICIENT_REQ_MEMORY: Requested memory allocation 250MiB is less than the usable minimum of 1800MB
	X Exiting due to RSRC_INSUFFICIENT_REQ_MEMORY: Requested memory allocation 250MiB is less than the usable minimum of 1800MB
	I1124 09:02:10.635802 1692325 out.go:203] 

** /stderr **
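The exit status 23 above comes from start-flag validation rather than any cluster operation: the requested 250MB is compared against minikube's usable minimum before a node is touched. A minimal shell sketch of that comparison (threshold, message, and exit status quoted from the log above; variable names are illustrative, not minikube's — the real check lives in Go):

```shell
# Hypothetical re-creation of the RSRC_INSUFFICIENT_REQ_MEMORY check
# observed in the dry-run output above.
requested_mb=250   # from --memory 250MB
minimum_mb=1800    # usable minimum quoted in the error message
if [ "$requested_mb" -lt "$minimum_mb" ]; then
  echo "X Exiting due to RSRC_INSUFFICIENT_REQ_MEMORY: Requested memory allocation ${requested_mb}MiB is less than the usable minimum of ${minimum_mb}MB"
  status=23
else
  status=0
fi
echo "exit status ${status}"
```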
functional_test.go:1006: (dbg) Run:  out/minikube-linux-arm64 start -p functional-941011 --dry-run --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd
--- PASS: TestFunctional/parallel/DryRun (0.47s)

TestFunctional/parallel/InternationalLanguage (0.21s)

=== RUN   TestFunctional/parallel/InternationalLanguage
=== PAUSE TestFunctional/parallel/InternationalLanguage

=== CONT  TestFunctional/parallel/InternationalLanguage
functional_test.go:1035: (dbg) Run:  out/minikube-linux-arm64 start -p functional-941011 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=containerd
functional_test.go:1035: (dbg) Non-zero exit: out/minikube-linux-arm64 start -p functional-941011 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=containerd: exit status 23 (208.805782ms)

-- stdout --
	* [functional-941011] minikube v1.37.0 sur Ubuntu 20.04 (arm64)
	  - MINIKUBE_LOCATION=21978
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/21978-1652607/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/21978-1652607/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-arm64
	  - MINIKUBE_FORCE_SYSTEMD=
	* Utilisation du pilote docker basé sur le profil existant
	
	

-- /stdout --
** stderr ** 
	I1124 09:02:10.965378 1692444 out.go:360] Setting OutFile to fd 1 ...
	I1124 09:02:10.965542 1692444 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1124 09:02:10.965568 1692444 out.go:374] Setting ErrFile to fd 2...
	I1124 09:02:10.965585 1692444 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1124 09:02:10.965987 1692444 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21978-1652607/.minikube/bin
	I1124 09:02:10.966438 1692444 out.go:368] Setting JSON to false
	I1124 09:02:10.967543 1692444 start.go:133] hostinfo: {"hostname":"ip-172-31-30-239","uptime":27860,"bootTime":1763947071,"procs":190,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"92f46a7d-c249-4c12-924a-77f64874c910"}
	I1124 09:02:10.967616 1692444 start.go:143] virtualization:  
	I1124 09:02:10.970861 1692444 out.go:179] * [functional-941011] minikube v1.37.0 sur Ubuntu 20.04 (arm64)
	I1124 09:02:10.974642 1692444 out.go:179]   - MINIKUBE_LOCATION=21978
	I1124 09:02:10.974681 1692444 notify.go:221] Checking for updates...
	I1124 09:02:10.980472 1692444 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1124 09:02:10.983548 1692444 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/21978-1652607/kubeconfig
	I1124 09:02:10.986537 1692444 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/21978-1652607/.minikube
	I1124 09:02:10.989522 1692444 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1124 09:02:10.992492 1692444 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1124 09:02:10.996012 1692444 config.go:182] Loaded profile config "functional-941011": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
	I1124 09:02:10.999624 1692444 driver.go:422] Setting default libvirt URI to qemu:///system
	I1124 09:02:11.026708 1692444 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1124 09:02:11.026833 1692444 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1124 09:02:11.090005 1692444 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-11-24 09:02:11.080326924 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:a
arch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214831104 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx P
ath:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1124 09:02:11.090124 1692444 docker.go:319] overlay module found
	I1124 09:02:11.093275 1692444 out.go:179] * Utilisation du pilote docker basé sur le profil existant
	I1124 09:02:11.095997 1692444 start.go:309] selected driver: docker
	I1124 09:02:11.096021 1692444 start.go:927] validating driver "docker" against &{Name:functional-941011 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.2 ClusterName:functional-941011 Namespace:default APIServerHAVIP: APIServerNa
me:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.34.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOpt
ions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1124 09:02:11.096132 1692444 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1124 09:02:11.099708 1692444 out.go:203] 
	W1124 09:02:11.102517 1692444 out.go:285] X Fermeture en raison de RSRC_INSUFFICIENT_REQ_MEMORY : L'allocation de mémoire demandée 250 Mio est inférieure au minimum utilisable de 1800 Mo
	X Fermeture en raison de RSRC_INSUFFICIENT_REQ_MEMORY : L'allocation de mémoire demandée 250 Mio est inférieure au minimum utilisable de 1800 Mo
	I1124 09:02:11.105331 1692444 out.go:203] 

** /stderr **
--- PASS: TestFunctional/parallel/InternationalLanguage (0.21s)

TestFunctional/parallel/StatusCmd (1.07s)

=== RUN   TestFunctional/parallel/StatusCmd
=== PAUSE TestFunctional/parallel/StatusCmd

=== CONT  TestFunctional/parallel/StatusCmd
functional_test.go:869: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 status
functional_test.go:875: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 status -f host:{{.Host}},kublet:{{.Kubelet}},apiserver:{{.APIServer}},kubeconfig:{{.Kubeconfig}}
functional_test.go:887: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 status -o json
--- PASS: TestFunctional/parallel/StatusCmd (1.07s)

TestFunctional/parallel/AddonsCmd (0.15s)

=== RUN   TestFunctional/parallel/AddonsCmd
=== PAUSE TestFunctional/parallel/AddonsCmd

=== CONT  TestFunctional/parallel/AddonsCmd
functional_test.go:1695: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 addons list
functional_test.go:1707: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 addons list -o json
--- PASS: TestFunctional/parallel/AddonsCmd (0.15s)

TestFunctional/parallel/SSHCmd (0.59s)

=== RUN   TestFunctional/parallel/SSHCmd
=== PAUSE TestFunctional/parallel/SSHCmd

=== CONT  TestFunctional/parallel/SSHCmd
functional_test.go:1730: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 ssh "echo hello"
functional_test.go:1747: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 ssh "cat /etc/hostname"
--- PASS: TestFunctional/parallel/SSHCmd (0.59s)

TestFunctional/parallel/CpCmd (2.1s)

=== RUN   TestFunctional/parallel/CpCmd
=== PAUSE TestFunctional/parallel/CpCmd

=== CONT  TestFunctional/parallel/CpCmd
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 cp testdata/cp-test.txt /home/docker/cp-test.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 ssh -n functional-941011 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 cp functional-941011:/home/docker/cp-test.txt /tmp/TestFunctionalparallelCpCmd1275226348/001/cp-test.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 ssh -n functional-941011 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 cp testdata/cp-test.txt /tmp/does/not/exist/cp-test.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 ssh -n functional-941011 "sudo cat /tmp/does/not/exist/cp-test.txt"
--- PASS: TestFunctional/parallel/CpCmd (2.10s)

TestFunctional/parallel/FileSync (0.36s)

=== RUN   TestFunctional/parallel/FileSync
=== PAUSE TestFunctional/parallel/FileSync

=== CONT  TestFunctional/parallel/FileSync
functional_test.go:1934: Checking for existence of /etc/test/nested/copy/1654467/hosts within VM
functional_test.go:1936: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 ssh "sudo cat /etc/test/nested/copy/1654467/hosts"
E1124 08:56:24.104743 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/addons-674149/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
functional_test.go:1941: file sync test content: Test file for checking file sync process
--- PASS: TestFunctional/parallel/FileSync (0.36s)

TestFunctional/parallel/CertSync (2.26s)

=== RUN   TestFunctional/parallel/CertSync
=== PAUSE TestFunctional/parallel/CertSync

=== CONT  TestFunctional/parallel/CertSync
functional_test.go:1977: Checking for existence of /etc/ssl/certs/1654467.pem within VM
functional_test.go:1978: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 ssh "sudo cat /etc/ssl/certs/1654467.pem"
functional_test.go:1977: Checking for existence of /usr/share/ca-certificates/1654467.pem within VM
functional_test.go:1978: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 ssh "sudo cat /usr/share/ca-certificates/1654467.pem"
functional_test.go:1977: Checking for existence of /etc/ssl/certs/51391683.0 within VM
functional_test.go:1978: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 ssh "sudo cat /etc/ssl/certs/51391683.0"
functional_test.go:2004: Checking for existence of /etc/ssl/certs/16544672.pem within VM
functional_test.go:2005: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 ssh "sudo cat /etc/ssl/certs/16544672.pem"
functional_test.go:2004: Checking for existence of /usr/share/ca-certificates/16544672.pem within VM
functional_test.go:2005: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 ssh "sudo cat /usr/share/ca-certificates/16544672.pem"
functional_test.go:2004: Checking for existence of /etc/ssl/certs/3ec20f2e.0 within VM
functional_test.go:2005: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 ssh "sudo cat /etc/ssl/certs/3ec20f2e.0"
--- PASS: TestFunctional/parallel/CertSync (2.26s)

TestFunctional/parallel/NodeLabels (0.11s)

=== RUN   TestFunctional/parallel/NodeLabels
=== PAUSE TestFunctional/parallel/NodeLabels

=== CONT  TestFunctional/parallel/NodeLabels
functional_test.go:234: (dbg) Run:  kubectl --context functional-941011 get nodes --output=go-template "--template='{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'"
--- PASS: TestFunctional/parallel/NodeLabels (0.11s)

TestFunctional/parallel/NonActiveRuntimeDisabled (0.74s)

=== RUN   TestFunctional/parallel/NonActiveRuntimeDisabled
=== PAUSE TestFunctional/parallel/NonActiveRuntimeDisabled

=== CONT  TestFunctional/parallel/NonActiveRuntimeDisabled
functional_test.go:2032: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 ssh "sudo systemctl is-active docker"
functional_test.go:2032: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-941011 ssh "sudo systemctl is-active docker": exit status 1 (382.642973ms)

-- stdout --
	inactive

-- /stdout --
** stderr ** 
	ssh: Process exited with status 3

** /stderr **
functional_test.go:2032: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 ssh "sudo systemctl is-active crio"
functional_test.go:2032: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-941011 ssh "sudo systemctl is-active crio": exit status 1 (356.211943ms)

-- stdout --
	inactive

-- /stdout --
** stderr ** 
	ssh: Process exited with status 3

** /stderr **
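The two exit status 1 results above follow from systemd's convention: `systemctl is-active` prints the unit state and exits 0 only when the unit is active (3 for inactive), and `minikube ssh` reports its own exit status 1 while noting the remote status 3 on stderr. A self-contained sketch of that mapping, with a hypothetical `is_active` helper standing in for the real systemd query (no cluster required):

```shell
# Model of the `systemctl is-active <unit>` exit semantics the
# NonActiveRuntimeDisabled test relies on: print the state, then
# return 0 for "active" and 3 for anything else.
is_active() {
  state="$1"
  echo "$state"
  if [ "$state" = "active" ]; then return 0; else return 3; fi
}
is_active inactive   # prints "inactive", like docker/crio in the log
rc=$?
echo "rc=$rc"
```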
--- PASS: TestFunctional/parallel/NonActiveRuntimeDisabled (0.74s)

TestFunctional/parallel/License (0.36s)

=== RUN   TestFunctional/parallel/License
=== PAUSE TestFunctional/parallel/License

=== CONT  TestFunctional/parallel/License
functional_test.go:2293: (dbg) Run:  out/minikube-linux-arm64 license
--- PASS: TestFunctional/parallel/License (0.36s)

TestFunctional/parallel/Version/short (0.06s)

=== RUN   TestFunctional/parallel/Version/short
=== PAUSE TestFunctional/parallel/Version/short

=== CONT  TestFunctional/parallel/Version/short
functional_test.go:2261: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 version --short
--- PASS: TestFunctional/parallel/Version/short (0.06s)

TestFunctional/parallel/Version/components (1.19s)

=== RUN   TestFunctional/parallel/Version/components
=== PAUSE TestFunctional/parallel/Version/components

=== CONT  TestFunctional/parallel/Version/components
functional_test.go:2275: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 version -o=json --components
functional_test.go:2275: (dbg) Done: out/minikube-linux-arm64 -p functional-941011 version -o=json --components: (1.1894617s)
--- PASS: TestFunctional/parallel/Version/components (1.19s)

TestFunctional/parallel/ImageCommands/ImageListShort (0.22s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageListShort
=== PAUSE TestFunctional/parallel/ImageCommands/ImageListShort

=== CONT  TestFunctional/parallel/ImageCommands/ImageListShort
functional_test.go:276: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 image ls --format short --alsologtostderr
functional_test.go:281: (dbg) Stdout: out/minikube-linux-arm64 -p functional-941011 image ls --format short --alsologtostderr:
registry.k8s.io/pause:latest
registry.k8s.io/pause:3.3
registry.k8s.io/pause:3.10.1
registry.k8s.io/pause:3.1
registry.k8s.io/kube-scheduler:v1.34.2
registry.k8s.io/kube-proxy:v1.34.2
registry.k8s.io/kube-controller-manager:v1.34.2
registry.k8s.io/kube-apiserver:v1.34.2
registry.k8s.io/etcd:3.6.5-0
registry.k8s.io/coredns/coredns:v1.12.1
gcr.io/k8s-minikube/storage-provisioner:v5
gcr.io/k8s-minikube/busybox:1.28.4-glibc
docker.io/library/minikube-local-cache-test:functional-941011
docker.io/kindest/kindnetd:v20250512-df8de77b
docker.io/kicbase/echo-server:functional-941011
functional_test.go:284: (dbg) Stderr: out/minikube-linux-arm64 -p functional-941011 image ls --format short --alsologtostderr:
I1124 09:07:15.541005 1693519 out.go:360] Setting OutFile to fd 1 ...
I1124 09:07:15.541206 1693519 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1124 09:07:15.541218 1693519 out.go:374] Setting ErrFile to fd 2...
I1124 09:07:15.541223 1693519 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1124 09:07:15.541474 1693519 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21978-1652607/.minikube/bin
I1124 09:07:15.542079 1693519 config.go:182] Loaded profile config "functional-941011": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
I1124 09:07:15.542237 1693519 config.go:182] Loaded profile config "functional-941011": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
I1124 09:07:15.542812 1693519 cli_runner.go:164] Run: docker container inspect functional-941011 --format={{.State.Status}}
I1124 09:07:15.559642 1693519 ssh_runner.go:195] Run: systemctl --version
I1124 09:07:15.559716 1693519 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-941011
I1124 09:07:15.577858 1693519 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34679 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-941011/id_rsa Username:docker}
I1124 09:07:15.686040 1693519 ssh_runner.go:195] Run: sudo crictl images --output json
--- PASS: TestFunctional/parallel/ImageCommands/ImageListShort (0.22s)

TestFunctional/parallel/ImageCommands/ImageListTable (0.23s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageListTable
=== PAUSE TestFunctional/parallel/ImageCommands/ImageListTable

=== CONT  TestFunctional/parallel/ImageCommands/ImageListTable
functional_test.go:276: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 image ls --format table --alsologtostderr
functional_test.go:281: (dbg) Stdout: out/minikube-linux-arm64 -p functional-941011 image ls --format table --alsologtostderr:
┌─────────────────────────────────────────────┬────────────────────┬───────────────┬────────┐
│                    IMAGE                    │        TAG         │   IMAGE ID    │  SIZE  │
├─────────────────────────────────────────────┼────────────────────┼───────────────┼────────┤
│ docker.io/library/minikube-local-cache-test │ functional-941011  │ sha256:cee23f │ 991B   │
│ localhost/my-image                          │ functional-941011  │ sha256:0dd6d7 │ 831kB  │
│ registry.k8s.io/etcd                        │ 3.6.5-0            │ sha256:2c5f0d │ 21.1MB │
│ docker.io/kindest/kindnetd                  │ v20250512-df8de77b │ sha256:b1a8c6 │ 40.6MB │
│ registry.k8s.io/coredns/coredns             │ v1.12.1            │ sha256:138784 │ 20.4MB │
│ registry.k8s.io/kube-scheduler              │ v1.34.2            │ sha256:4f982e │ 15.8MB │
│ registry.k8s.io/pause                       │ 3.1                │ sha256:8057e0 │ 262kB  │
│ registry.k8s.io/pause                       │ 3.3                │ sha256:3d1873 │ 249kB  │
│ registry.k8s.io/pause                       │ latest             │ sha256:8cb209 │ 71.3kB │
│ registry.k8s.io/kube-controller-manager     │ v1.34.2            │ sha256:1b3491 │ 20.7MB │
│ registry.k8s.io/kube-proxy                  │ v1.34.2            │ sha256:94bff1 │ 22.8MB │
│ registry.k8s.io/pause                       │ 3.10.1             │ sha256:d7b100 │ 268kB  │
│ docker.io/kicbase/echo-server               │ functional-941011  │ sha256:ce2d2c │ 2.17MB │
│ gcr.io/k8s-minikube/busybox                 │ 1.28.4-glibc       │ sha256:1611cd │ 1.94MB │
│ gcr.io/k8s-minikube/storage-provisioner     │ v5                 │ sha256:ba04bb │ 8.03MB │
│ registry.k8s.io/kube-apiserver              │ v1.34.2            │ sha256:b178af │ 24.6MB │
└─────────────────────────────────────────────┴────────────────────┴───────────────┴────────┘
functional_test.go:284: (dbg) Stderr: out/minikube-linux-arm64 -p functional-941011 image ls --format table --alsologtostderr:
I1124 09:07:19.727043 1693884 out.go:360] Setting OutFile to fd 1 ...
I1124 09:07:19.727219 1693884 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1124 09:07:19.727248 1693884 out.go:374] Setting ErrFile to fd 2...
I1124 09:07:19.727269 1693884 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1124 09:07:19.727532 1693884 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21978-1652607/.minikube/bin
I1124 09:07:19.728159 1693884 config.go:182] Loaded profile config "functional-941011": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
I1124 09:07:19.728333 1693884 config.go:182] Loaded profile config "functional-941011": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
I1124 09:07:19.728919 1693884 cli_runner.go:164] Run: docker container inspect functional-941011 --format={{.State.Status}}
I1124 09:07:19.746725 1693884 ssh_runner.go:195] Run: systemctl --version
I1124 09:07:19.746788 1693884 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-941011
I1124 09:07:19.763499 1693884 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34679 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-941011/id_rsa Username:docker}
I1124 09:07:19.869101 1693884 ssh_runner.go:195] Run: sudo crictl images --output json
--- PASS: TestFunctional/parallel/ImageCommands/ImageListTable (0.23s)

TestFunctional/parallel/ImageCommands/ImageListJson (0.23s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageListJson
=== PAUSE TestFunctional/parallel/ImageCommands/ImageListJson

=== CONT  TestFunctional/parallel/ImageCommands/ImageListJson
functional_test.go:276: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 image ls --format json --alsologtostderr
functional_test.go:281: (dbg) Stdout: out/minikube-linux-arm64 -p functional-941011 image ls --format json --alsologtostderr:
[{"id":"sha256:1611cd07b61d57dbbfebe6db242513fd51e1c02d20ba08af17a45837d86a8a8c","repoDigests":["gcr.io/k8s-minikube/busybox@sha256:2d03e6ceeb99250061dd110530b0ece7998cd84121f952adef120ea7c5a6f00e"],"repoTags":["gcr.io/k8s-minikube/busybox:1.28.4-glibc"],"size":"1935750"},{"id":"sha256:0dd6d757450a764d00308d5c72c625f64f21a54f2f9c75957a4660b66eb329a5","repoDigests":[],"repoTags":["localhost/my-image:functional-941011"],"size":"830616"},{"id":"sha256:8057e0500773a37cde2cff041eb13ebd68c748419a2fbfd1dfb5bf38696cc8e5","repoDigests":[],"repoTags":["registry.k8s.io/pause:3.1"],"size":"262191"},{"id":"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17","repoDigests":[],"repoTags":["docker.io/kicbase/echo-server:functional-941011"],"size":"2173567"},{"id":"sha256:b1a8c6f707935fd5f346ce5846d21ff8dd65e14c15406a14dbd16b9b897b9b4c","repoDigests":["docker.io/kindest/kindnetd@sha256:07a4b3fe0077a0ae606cc0a200fc25a28fa64dcc30b8d311b461089969449f9a"],"repoTags":["docker.io/kindest/kindnetd:v20250512-df8de77b"],"size":"40636774"},{"id":"sha256:ba04bb24b95753201135cbc420b233c1b0b9fa2e1fd21d28319c348c33fbcde6","repoDigests":["gcr.io/k8s-minikube/storage-provisioner@sha256:18eb69d1418e854ad5a19e399310e52808a8321e4c441c1dddad8977a0d7a944"],"repoTags":["gcr.io/k8s-minikube/storage-provisioner:v5"],"size":"8034419"},{"id":"sha256:2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42","repoDigests":["registry.k8s.io/etcd@sha256:042ef9c02799eb9303abf1aa99b09f09d94b8ee3ba0c2dd3f42dc4e1d3dce534"],"repoTags":["registry.k8s.io/etcd:3.6.5-0"],"size":"21136588"},{"id":"sha256:1b34917560f0916ad0d1e98debeaf98c640b68c5a38f6d87711f0e288e5d7be2","repoDigests":["registry.k8s.io/kube-controller-manager@sha256:5c3998664b77441c09a4604f1361b230e63f7a6f299fc02fc1ebd1a12c38e3eb"],"repoTags":["registry.k8s.io/kube-controller-manager:v1.34.2"],"size":"20718696"},{"id":"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd","repoDigests":["registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c"],"repoTags":["registry.k8s.io/pause:3.10.1"],"size":"267939"},{"id":"sha256:3d18732f8686cc3c878055d99a05fa80289502fa496b36b6a0fe0f77206a7300","repoDigests":[],"repoTags":["registry.k8s.io/pause:3.3"],"size":"249461"},{"id":"sha256:cee23f3226e286bed41a2ff2b5fad8e6e395a2934880a19d2a902b07140a4221","repoDigests":[],"repoTags":["docker.io/library/minikube-local-cache-test:functional-941011"],"size":"991"},{"id":"sha256:138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc","repoDigests":["registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c"],"repoTags":["registry.k8s.io/coredns/coredns:v1.12.1"],"size":"20392204"},{"id":"sha256:4f982e73e768a6ccebb54f8905b83b78d56b3a014e709c0bfe77140db3543949","repoDigests":["registry.k8s.io/kube-scheduler@sha256:44229946c0966b07d5c0791681d803e77258949985e49b4ab0fbdff99d2a48c6"],"repoTags":["registry.k8s.io/kube-scheduler:v1.34.2"],"size":"15775785"},{"id":"sha256:b178af3d91f80925cd8bec42e1813e7d46370236a811d3380c9c10a02b245ca7","repoDigests":["registry.k8s.io/kube-apiserver@sha256:e009ef63deaf797763b5bd423d04a099a2fe414a081bf7d216b43bc9e76b9077"],"repoTags":["registry.k8s.io/kube-apiserver:v1.34.2"],"size":"24559643"},{"id":"sha256:94bff1bec29fd04573941f362e44a6730b151d46df215613feb3f1167703f786","repoDigests":["registry.k8s.io/kube-proxy@sha256:d8b843ac8a5e861238df24a4db8c2ddced89948633400c4660464472045276f5"],"repoTags":["registry.k8s.io/kube-proxy:v1.34.2"],"size":"22802260"},{"id":"sha256:8cb2091f603e75187e2f6226c5901d12e00b1d1f778c6471ae4578e8a1c4724a","repoDigests":[],"repoTags":["registry.k8s.io/pause:latest"],"size":"71300"}]
functional_test.go:284: (dbg) Stderr: out/minikube-linux-arm64 -p functional-941011 image ls --format json --alsologtostderr:
I1124 09:07:19.492734 1693848 out.go:360] Setting OutFile to fd 1 ...
I1124 09:07:19.492924 1693848 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1124 09:07:19.492938 1693848 out.go:374] Setting ErrFile to fd 2...
I1124 09:07:19.492944 1693848 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1124 09:07:19.493226 1693848 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21978-1652607/.minikube/bin
I1124 09:07:19.493880 1693848 config.go:182] Loaded profile config "functional-941011": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
I1124 09:07:19.494039 1693848 config.go:182] Loaded profile config "functional-941011": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
I1124 09:07:19.494702 1693848 cli_runner.go:164] Run: docker container inspect functional-941011 --format={{.State.Status}}
I1124 09:07:19.512347 1693848 ssh_runner.go:195] Run: systemctl --version
I1124 09:07:19.512407 1693848 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-941011
I1124 09:07:19.530847 1693848 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34679 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-941011/id_rsa Username:docker}
I1124 09:07:19.639470 1693848 ssh_runner.go:195] Run: sudo crictl images --output json
--- PASS: TestFunctional/parallel/ImageCommands/ImageListJson (0.23s)
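Note: the JSON listing above uses the same schema minikube obtains from `sudo crictl images --output json` (the last command in each stderr log): per-image `id`, `repoDigests`, `repoTags`, and a byte-count `size` string. As a minimal illustration of how such a listing maps onto the table view shown earlier, here is a Python sketch; the two inlined entries are copied from this run's output, but the size formatting (decimal units, three significant figures) is an assumption inferred from the table output, not minikube's actual implementation.

```python
import json

# Two entries copied verbatim from the `image ls --format json` output above.
SAMPLE = json.dumps([
    {"id": "sha256:1611cd07b61d57dbbfebe6db242513fd51e1c02d20ba08af17a45837d86a8a8c",
     "repoDigests": ["gcr.io/k8s-minikube/busybox@sha256:2d03e6ceeb99250061dd110530b0ece7998cd84121f952adef120ea7c5a6f00e"],
     "repoTags": ["gcr.io/k8s-minikube/busybox:1.28.4-glibc"],
     "size": "1935750"},
    {"id": "sha256:cee23f3226e286bed41a2ff2b5fad8e6e395a2934880a19d2a902b07140a4221",
     "repoDigests": [],
     "repoTags": ["docker.io/library/minikube-local-cache-test:functional-941011"],
     "size": "991"},
])

def human_size(size: str) -> str:
    """Render a byte count the way the table view does: decimal units, 3 sig figs."""
    n = int(size)
    if n < 1000:
        return f"{n}B"
    value = float(n)
    for unit in ("kB", "MB", "GB"):
        value /= 1000
        if value < 1000 or unit == "GB":
            return f"{value:.3g}{unit}"

def rows(raw: str):
    """Yield (repo, tag, short-id, size) tuples, like the table columns."""
    for image in json.loads(raw):
        tagged = image["repoTags"][0] if image["repoTags"] else "<none>:<none>"
        repo, _, tag = tagged.rpartition(":")
        # The table truncates IDs to "sha256:" plus 6 hex digits (13 chars).
        yield repo, tag, image["id"][:13], human_size(image["size"])

for repo, tag, image_id, size in rows(SAMPLE):
    print(f"{repo:<45} {tag:<20} {image_id:<14} {size}")
```

Running this against the full listing reproduces the sizes shown in the table section (e.g. `1935750` renders as `1.94MB` and `991` as `991B`).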

TestFunctional/parallel/ImageCommands/ImageListYaml (0.23s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageListYaml
=== PAUSE TestFunctional/parallel/ImageCommands/ImageListYaml

=== CONT  TestFunctional/parallel/ImageCommands/ImageListYaml
functional_test.go:276: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 image ls --format yaml --alsologtostderr
functional_test.go:281: (dbg) Stdout: out/minikube-linux-arm64 -p functional-941011 image ls --format yaml --alsologtostderr:
- id: sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd
repoDigests:
- registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c
repoTags:
- registry.k8s.io/pause:3.10.1
size: "267939"
- id: sha256:cee23f3226e286bed41a2ff2b5fad8e6e395a2934880a19d2a902b07140a4221
repoDigests: []
repoTags:
- docker.io/library/minikube-local-cache-test:functional-941011
size: "991"
- id: sha256:ba04bb24b95753201135cbc420b233c1b0b9fa2e1fd21d28319c348c33fbcde6
repoDigests:
- gcr.io/k8s-minikube/storage-provisioner@sha256:18eb69d1418e854ad5a19e399310e52808a8321e4c441c1dddad8977a0d7a944
repoTags:
- gcr.io/k8s-minikube/storage-provisioner:v5
size: "8034419"
- id: sha256:1b34917560f0916ad0d1e98debeaf98c640b68c5a38f6d87711f0e288e5d7be2
repoDigests:
- registry.k8s.io/kube-controller-manager@sha256:5c3998664b77441c09a4604f1361b230e63f7a6f299fc02fc1ebd1a12c38e3eb
repoTags:
- registry.k8s.io/kube-controller-manager:v1.34.2
size: "20718696"
- id: sha256:3d18732f8686cc3c878055d99a05fa80289502fa496b36b6a0fe0f77206a7300
repoDigests: []
repoTags:
- registry.k8s.io/pause:3.3
size: "249461"
- id: sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17
repoDigests: []
repoTags:
- docker.io/kicbase/echo-server:functional-941011
size: "2173567"
- id: sha256:b1a8c6f707935fd5f346ce5846d21ff8dd65e14c15406a14dbd16b9b897b9b4c
repoDigests:
- docker.io/kindest/kindnetd@sha256:07a4b3fe0077a0ae606cc0a200fc25a28fa64dcc30b8d311b461089969449f9a
repoTags:
- docker.io/kindest/kindnetd:v20250512-df8de77b
size: "40636774"
- id: sha256:2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42
repoDigests:
- registry.k8s.io/etcd@sha256:042ef9c02799eb9303abf1aa99b09f09d94b8ee3ba0c2dd3f42dc4e1d3dce534
repoTags:
- registry.k8s.io/etcd:3.6.5-0
size: "21136588"
- id: sha256:b178af3d91f80925cd8bec42e1813e7d46370236a811d3380c9c10a02b245ca7
repoDigests:
- registry.k8s.io/kube-apiserver@sha256:e009ef63deaf797763b5bd423d04a099a2fe414a081bf7d216b43bc9e76b9077
repoTags:
- registry.k8s.io/kube-apiserver:v1.34.2
size: "24559643"
- id: sha256:8057e0500773a37cde2cff041eb13ebd68c748419a2fbfd1dfb5bf38696cc8e5
repoDigests: []
repoTags:
- registry.k8s.io/pause:3.1
size: "262191"
- id: sha256:1611cd07b61d57dbbfebe6db242513fd51e1c02d20ba08af17a45837d86a8a8c
repoDigests:
- gcr.io/k8s-minikube/busybox@sha256:2d03e6ceeb99250061dd110530b0ece7998cd84121f952adef120ea7c5a6f00e
repoTags:
- gcr.io/k8s-minikube/busybox:1.28.4-glibc
size: "1935750"
- id: sha256:138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc
repoDigests:
- registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c
repoTags:
- registry.k8s.io/coredns/coredns:v1.12.1
size: "20392204"
- id: sha256:94bff1bec29fd04573941f362e44a6730b151d46df215613feb3f1167703f786
repoDigests:
- registry.k8s.io/kube-proxy@sha256:d8b843ac8a5e861238df24a4db8c2ddced89948633400c4660464472045276f5
repoTags:
- registry.k8s.io/kube-proxy:v1.34.2
size: "22802260"
- id: sha256:4f982e73e768a6ccebb54f8905b83b78d56b3a014e709c0bfe77140db3543949
repoDigests:
- registry.k8s.io/kube-scheduler@sha256:44229946c0966b07d5c0791681d803e77258949985e49b4ab0fbdff99d2a48c6
repoTags:
- registry.k8s.io/kube-scheduler:v1.34.2
size: "15775785"
- id: sha256:8cb2091f603e75187e2f6226c5901d12e00b1d1f778c6471ae4578e8a1c4724a
repoDigests: []
repoTags:
- registry.k8s.io/pause:latest
size: "71300"

functional_test.go:284: (dbg) Stderr: out/minikube-linux-arm64 -p functional-941011 image ls --format yaml --alsologtostderr:
I1124 09:07:15.769122 1693555 out.go:360] Setting OutFile to fd 1 ...
I1124 09:07:15.769306 1693555 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1124 09:07:15.769335 1693555 out.go:374] Setting ErrFile to fd 2...
I1124 09:07:15.769357 1693555 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1124 09:07:15.769662 1693555 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21978-1652607/.minikube/bin
I1124 09:07:15.770336 1693555 config.go:182] Loaded profile config "functional-941011": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
I1124 09:07:15.770522 1693555 config.go:182] Loaded profile config "functional-941011": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
I1124 09:07:15.771175 1693555 cli_runner.go:164] Run: docker container inspect functional-941011 --format={{.State.Status}}
I1124 09:07:15.788496 1693555 ssh_runner.go:195] Run: systemctl --version
I1124 09:07:15.788558 1693555 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-941011
I1124 09:07:15.806798 1693555 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34679 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-941011/id_rsa Username:docker}
I1124 09:07:15.913977 1693555 ssh_runner.go:195] Run: sudo crictl images --output json
--- PASS: TestFunctional/parallel/ImageCommands/ImageListYaml (0.23s)

TestFunctional/parallel/ImageCommands/ImageBuild (3.49s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageBuild
=== PAUSE TestFunctional/parallel/ImageCommands/ImageBuild

=== CONT  TestFunctional/parallel/ImageCommands/ImageBuild
functional_test.go:323: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 ssh pgrep buildkitd
functional_test.go:323: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-941011 ssh pgrep buildkitd: exit status 1 (277.292858ms)

** stderr ** 
	ssh: Process exited with status 1

** /stderr **
functional_test.go:330: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 image build -t localhost/my-image:functional-941011 testdata/build --alsologtostderr
functional_test.go:330: (dbg) Done: out/minikube-linux-arm64 -p functional-941011 image build -t localhost/my-image:functional-941011 testdata/build --alsologtostderr: (2.977770155s)
functional_test.go:338: (dbg) Stderr: out/minikube-linux-arm64 -p functional-941011 image build -t localhost/my-image:functional-941011 testdata/build --alsologtostderr:
I1124 09:07:16.276133 1693654 out.go:360] Setting OutFile to fd 1 ...
I1124 09:07:16.278072 1693654 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1124 09:07:16.278134 1693654 out.go:374] Setting ErrFile to fd 2...
I1124 09:07:16.278157 1693654 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1124 09:07:16.278512 1693654 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21978-1652607/.minikube/bin
I1124 09:07:16.279240 1693654 config.go:182] Loaded profile config "functional-941011": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
I1124 09:07:16.282605 1693654 config.go:182] Loaded profile config "functional-941011": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
I1124 09:07:16.283280 1693654 cli_runner.go:164] Run: docker container inspect functional-941011 --format={{.State.Status}}
I1124 09:07:16.300253 1693654 ssh_runner.go:195] Run: systemctl --version
I1124 09:07:16.300318 1693654 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-941011
I1124 09:07:16.316655 1693654 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34679 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-941011/id_rsa Username:docker}
I1124 09:07:16.421087 1693654 build_images.go:162] Building image from path: /tmp/build.3157104403.tar
I1124 09:07:16.421187 1693654 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/build
I1124 09:07:16.429107 1693654 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/build/build.3157104403.tar
I1124 09:07:16.432783 1693654 ssh_runner.go:352] existence check for /var/lib/minikube/build/build.3157104403.tar: stat -c "%s %y" /var/lib/minikube/build/build.3157104403.tar: Process exited with status 1
stdout:

stderr:
stat: cannot statx '/var/lib/minikube/build/build.3157104403.tar': No such file or directory
I1124 09:07:16.432817 1693654 ssh_runner.go:362] scp /tmp/build.3157104403.tar --> /var/lib/minikube/build/build.3157104403.tar (3072 bytes)
I1124 09:07:16.451087 1693654 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/build/build.3157104403
I1124 09:07:16.459324 1693654 ssh_runner.go:195] Run: sudo tar -C /var/lib/minikube/build/build.3157104403 -xf /var/lib/minikube/build/build.3157104403.tar
I1124 09:07:16.467369 1693654 containerd.go:394] Building image: /var/lib/minikube/build/build.3157104403
I1124 09:07:16.467459 1693654 ssh_runner.go:195] Run: sudo buildctl build --frontend dockerfile.v0 --local context=/var/lib/minikube/build/build.3157104403 --local dockerfile=/var/lib/minikube/build/build.3157104403 --output type=image,name=localhost/my-image:functional-941011
#1 [internal] load build definition from Dockerfile
#1 DONE 0.0s

#1 [internal] load build definition from Dockerfile
#1 transferring dockerfile: 97B done
#1 DONE 0.0s

#2 [internal] load metadata for gcr.io/k8s-minikube/busybox:latest
#2 DONE 1.5s

#3 [internal] load .dockerignore
#3 transferring context: 2B done
#3 DONE 0.0s

#4 [internal] load build context
#4 transferring context: 62B done
#4 DONE 0.0s

#5 [1/3] FROM gcr.io/k8s-minikube/busybox:latest@sha256:ca5ae90100d50772da31f3b5016209e25ad61972404e2ccd83d44f10dee7e79b
#5 resolve gcr.io/k8s-minikube/busybox:latest@sha256:ca5ae90100d50772da31f3b5016209e25ad61972404e2ccd83d44f10dee7e79b 0.0s done
#5 DONE 0.1s

#5 [1/3] FROM gcr.io/k8s-minikube/busybox:latest@sha256:ca5ae90100d50772da31f3b5016209e25ad61972404e2ccd83d44f10dee7e79b
#5 sha256:a01966dde7f8d5ba10b6d87e776c7c8fb5a5f6bfa678874bd28b33b1fc6dba34 0B / 828.50kB 0.2s
#5 sha256:a01966dde7f8d5ba10b6d87e776c7c8fb5a5f6bfa678874bd28b33b1fc6dba34 828.50kB / 828.50kB 0.4s done
#5 extracting sha256:a01966dde7f8d5ba10b6d87e776c7c8fb5a5f6bfa678874bd28b33b1fc6dba34 0.1s done
#5 DONE 0.6s

#6 [2/3] RUN true
#6 DONE 0.2s

#7 [3/3] ADD content.txt /
#7 DONE 0.0s

#8 exporting to image
#8 exporting layers 0.1s done
#8 exporting manifest sha256:b5a6055b8ece382ff94df8256c43e2f2b4718a85bb69708286fb4ad516e08ab0
#8 exporting manifest sha256:b5a6055b8ece382ff94df8256c43e2f2b4718a85bb69708286fb4ad516e08ab0 0.0s done
#8 exporting config sha256:0dd6d757450a764d00308d5c72c625f64f21a54f2f9c75957a4660b66eb329a5 0.0s done
#8 naming to localhost/my-image:functional-941011 done
#8 DONE 0.2s
I1124 09:07:19.177313 1693654 ssh_runner.go:235] Completed: sudo buildctl build --frontend dockerfile.v0 --local context=/var/lib/minikube/build/build.3157104403 --local dockerfile=/var/lib/minikube/build/build.3157104403 --output type=image,name=localhost/my-image:functional-941011: (2.709823572s)
I1124 09:07:19.177379 1693654 ssh_runner.go:195] Run: sudo rm -rf /var/lib/minikube/build/build.3157104403
I1124 09:07:19.186346 1693654 ssh_runner.go:195] Run: sudo rm -f /var/lib/minikube/build/build.3157104403.tar
I1124 09:07:19.195636 1693654 build_images.go:218] Built localhost/my-image:functional-941011 from /tmp/build.3157104403.tar
I1124 09:07:19.195689 1693654 build_images.go:134] succeeded building to: functional-941011
I1124 09:07:19.195696 1693654 build_images.go:135] failed building to: 
functional_test.go:466: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageBuild (3.49s)
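Note: the ImageBuild log above shows the flow minikube uses for a containerd runtime: tar the build context locally (`/tmp/build.<nonce>.tar`), scp it to the node, unpack it under `/var/lib/minikube/build/`, then invoke `buildctl` with that directory as both context and dockerfile source. The sketch below imitates only the local staging step and composes the same `buildctl` command line the log records; `stage_build_context` and its paths are hypothetical names for illustration, and nothing here talks to a node or actually runs buildctl.

```python
import tarfile
import tempfile
from pathlib import Path

def stage_build_context(context_dir: Path, staging_root: Path):
    """Tar a build context and compose the buildctl invocation seen in the log."""
    tar_path = staging_root / f"{context_dir.name}.tar"
    with tarfile.open(tar_path, "w") as tar:
        for entry in sorted(context_dir.iterdir()):
            tar.add(entry, arcname=entry.name)
    # Where the node-side `tar -C ... -xf` would unpack the context.
    build_dir = staging_root / context_dir.name
    cmd = (
        "sudo buildctl build --frontend dockerfile.v0 "
        f"--local context={build_dir} --local dockerfile={build_dir} "
        "--output type=image,name=localhost/my-image:functional-941011"
    )
    return tar_path, cmd

# Demo with a throwaway context matching the build steps in the log:
# FROM busybox, RUN true, ADD content.txt.
with tempfile.TemporaryDirectory() as tmp:
    ctx = Path(tmp) / "build"
    ctx.mkdir()
    (ctx / "Dockerfile").write_text(
        "FROM gcr.io/k8s-minikube/busybox\nRUN true\nADD content.txt /\n"
    )
    (ctx / "content.txt").write_text("hello\n")
    tar_path, cmd = stage_build_context(ctx, Path(tmp))
    print(tar_path.name, "->", cmd)
```

The two-phase layout (one `--local context`, one `--local dockerfile`, both pointing at the unpacked tar) is exactly what the `buildctl build` line in the stderr log shows.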

TestFunctional/parallel/ImageCommands/Setup (0.69s)

=== RUN   TestFunctional/parallel/ImageCommands/Setup
functional_test.go:357: (dbg) Run:  docker pull kicbase/echo-server:1.0
functional_test.go:362: (dbg) Run:  docker tag kicbase/echo-server:1.0 kicbase/echo-server:functional-941011
--- PASS: TestFunctional/parallel/ImageCommands/Setup (0.69s)

TestFunctional/parallel/UpdateContextCmd/no_changes (0.15s)

=== RUN   TestFunctional/parallel/UpdateContextCmd/no_changes
=== PAUSE TestFunctional/parallel/UpdateContextCmd/no_changes

=== CONT  TestFunctional/parallel/UpdateContextCmd/no_changes
functional_test.go:2124: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 update-context --alsologtostderr -v=2
--- PASS: TestFunctional/parallel/UpdateContextCmd/no_changes (0.15s)

TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster (0.15s)

=== RUN   TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster
=== PAUSE TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster

=== CONT  TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster
functional_test.go:2124: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 update-context --alsologtostderr -v=2
--- PASS: TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster (0.15s)

TestFunctional/parallel/UpdateContextCmd/no_clusters (0.15s)

=== RUN   TestFunctional/parallel/UpdateContextCmd/no_clusters
=== PAUSE TestFunctional/parallel/UpdateContextCmd/no_clusters

=== CONT  TestFunctional/parallel/UpdateContextCmd/no_clusters
functional_test.go:2124: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 update-context --alsologtostderr -v=2
--- PASS: TestFunctional/parallel/UpdateContextCmd/no_clusters (0.15s)

TestFunctional/parallel/ImageCommands/ImageLoadDaemon (1.44s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageLoadDaemon
functional_test.go:370: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 image load --daemon kicbase/echo-server:functional-941011 --alsologtostderr
functional_test.go:370: (dbg) Done: out/minikube-linux-arm64 -p functional-941011 image load --daemon kicbase/echo-server:functional-941011 --alsologtostderr: (1.146617727s)
functional_test.go:466: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageLoadDaemon (1.44s)

TestFunctional/parallel/ImageCommands/ImageReloadDaemon (1.42s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageReloadDaemon
functional_test.go:380: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 image load --daemon kicbase/echo-server:functional-941011 --alsologtostderr
functional_test.go:380: (dbg) Done: out/minikube-linux-arm64 -p functional-941011 image load --daemon kicbase/echo-server:functional-941011 --alsologtostderr: (1.078228314s)
functional_test.go:466: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageReloadDaemon (1.42s)

TestFunctional/parallel/ServiceCmd/DeployApp (8.28s)

=== RUN   TestFunctional/parallel/ServiceCmd/DeployApp
functional_test.go:1451: (dbg) Run:  kubectl --context functional-941011 create deployment hello-node --image kicbase/echo-server
functional_test.go:1455: (dbg) Run:  kubectl --context functional-941011 expose deployment hello-node --type=NodePort --port=8080
functional_test.go:1460: (dbg) TestFunctional/parallel/ServiceCmd/DeployApp: waiting 10m0s for pods matching "app=hello-node" in namespace "default" ...
helpers_test.go:352: "hello-node-75c85bcc94-srggf" [71a67da8-3223-4cb8-a06f-9f6dc61472b0] Pending / Ready:ContainersNotReady (containers with unready status: [echo-server]) / ContainersReady:ContainersNotReady (containers with unready status: [echo-server])
helpers_test.go:352: "hello-node-75c85bcc94-srggf" [71a67da8-3223-4cb8-a06f-9f6dc61472b0] Running
functional_test.go:1460: (dbg) TestFunctional/parallel/ServiceCmd/DeployApp: app=hello-node healthy within 8.003231421s
--- PASS: TestFunctional/parallel/ServiceCmd/DeployApp (8.28s)

TestFunctional/parallel/ImageCommands/ImageTagAndLoadDaemon (1.35s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageTagAndLoadDaemon
functional_test.go:250: (dbg) Run:  docker pull kicbase/echo-server:latest
functional_test.go:255: (dbg) Run:  docker tag kicbase/echo-server:latest kicbase/echo-server:functional-941011
functional_test.go:260: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 image load --daemon kicbase/echo-server:functional-941011 --alsologtostderr
functional_test.go:466: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageTagAndLoadDaemon (1.35s)

TestFunctional/parallel/ImageCommands/ImageSaveToFile (0.34s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageSaveToFile
functional_test.go:395: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 image save kicbase/echo-server:functional-941011 /home/jenkins/workspace/Docker_Linux_containerd_arm64/echo-server-save.tar --alsologtostderr
--- PASS: TestFunctional/parallel/ImageCommands/ImageSaveToFile (0.34s)

TestFunctional/parallel/ImageCommands/ImageRemove (0.49s)

                                                
                                                
=== RUN   TestFunctional/parallel/ImageCommands/ImageRemove
functional_test.go:407: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 image rm kicbase/echo-server:functional-941011 --alsologtostderr
functional_test.go:466: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageRemove (0.49s)

TestFunctional/parallel/ImageCommands/ImageLoadFromFile (0.65s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageLoadFromFile
functional_test.go:424: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 image load /home/jenkins/workspace/Docker_Linux_containerd_arm64/echo-server-save.tar --alsologtostderr
functional_test.go:466: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageLoadFromFile (0.65s)

TestFunctional/parallel/ImageCommands/ImageSaveDaemon (0.39s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageSaveDaemon
functional_test.go:434: (dbg) Run:  docker rmi kicbase/echo-server:functional-941011
functional_test.go:439: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 image save --daemon kicbase/echo-server:functional-941011 --alsologtostderr
functional_test.go:447: (dbg) Run:  docker image inspect kicbase/echo-server:functional-941011
--- PASS: TestFunctional/parallel/ImageCommands/ImageSaveDaemon (0.39s)

TestFunctional/parallel/TunnelCmd/serial/RunSecondTunnel (0.52s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/RunSecondTunnel
functional_test_tunnel_test.go:154: (dbg) daemon: [out/minikube-linux-arm64 -p functional-941011 tunnel --alsologtostderr]
functional_test_tunnel_test.go:154: (dbg) daemon: [out/minikube-linux-arm64 -p functional-941011 tunnel --alsologtostderr]
functional_test_tunnel_test.go:194: (dbg) stopping [out/minikube-linux-arm64 -p functional-941011 tunnel --alsologtostderr] ...
helpers_test.go:525: unable to kill pid 1688452: os: process already finished
helpers_test.go:519: unable to terminate pid 1688323: os: process already finished
functional_test_tunnel_test.go:194: (dbg) stopping [out/minikube-linux-arm64 -p functional-941011 tunnel --alsologtostderr] ...
helpers_test.go:507: unable to find parent, assuming dead: process does not exist
--- PASS: TestFunctional/parallel/TunnelCmd/serial/RunSecondTunnel (0.52s)

TestFunctional/parallel/TunnelCmd/serial/StartTunnel (0s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/StartTunnel
functional_test_tunnel_test.go:129: (dbg) daemon: [out/minikube-linux-arm64 -p functional-941011 tunnel --alsologtostderr]
--- PASS: TestFunctional/parallel/TunnelCmd/serial/StartTunnel (0.00s)

TestFunctional/parallel/ServiceCmd/List (0.37s)

=== RUN   TestFunctional/parallel/ServiceCmd/List
functional_test.go:1469: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 service list
--- PASS: TestFunctional/parallel/ServiceCmd/List (0.37s)

TestFunctional/parallel/ServiceCmd/JSONOutput (0.35s)

=== RUN   TestFunctional/parallel/ServiceCmd/JSONOutput
functional_test.go:1499: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 service list -o json
functional_test.go:1504: Took "351.921877ms" to run "out/minikube-linux-arm64 -p functional-941011 service list -o json"
--- PASS: TestFunctional/parallel/ServiceCmd/JSONOutput (0.35s)

TestFunctional/parallel/ServiceCmd/HTTPS (0.37s)

=== RUN   TestFunctional/parallel/ServiceCmd/HTTPS
functional_test.go:1519: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 service --namespace=default --https --url hello-node
functional_test.go:1532: found endpoint: https://192.168.49.2:30476
--- PASS: TestFunctional/parallel/ServiceCmd/HTTPS (0.37s)

TestFunctional/parallel/ServiceCmd/Format (0.38s)

=== RUN   TestFunctional/parallel/ServiceCmd/Format
functional_test.go:1550: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 service hello-node --url --format={{.IP}}
--- PASS: TestFunctional/parallel/ServiceCmd/Format (0.38s)

TestFunctional/parallel/ServiceCmd/URL (0.41s)

=== RUN   TestFunctional/parallel/ServiceCmd/URL
functional_test.go:1569: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 service hello-node --url
functional_test.go:1575: found endpoint for hello-node: http://192.168.49.2:30476
--- PASS: TestFunctional/parallel/ServiceCmd/URL (0.41s)

TestFunctional/parallel/TunnelCmd/serial/DeleteTunnel (0.11s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/DeleteTunnel
functional_test_tunnel_test.go:434: (dbg) stopping [out/minikube-linux-arm64 -p functional-941011 tunnel --alsologtostderr] ...
functional_test_tunnel_test.go:437: failed to stop process: signal: terminated
--- PASS: TestFunctional/parallel/TunnelCmd/serial/DeleteTunnel (0.11s)

TestFunctional/parallel/ProfileCmd/profile_not_create (0.46s)

=== RUN   TestFunctional/parallel/ProfileCmd/profile_not_create
functional_test.go:1285: (dbg) Run:  out/minikube-linux-arm64 profile lis
functional_test.go:1290: (dbg) Run:  out/minikube-linux-arm64 profile list --output json
--- PASS: TestFunctional/parallel/ProfileCmd/profile_not_create (0.46s)

TestFunctional/parallel/ProfileCmd/profile_list (0.45s)

=== RUN   TestFunctional/parallel/ProfileCmd/profile_list
functional_test.go:1325: (dbg) Run:  out/minikube-linux-arm64 profile list
functional_test.go:1330: Took "388.475095ms" to run "out/minikube-linux-arm64 profile list"
functional_test.go:1339: (dbg) Run:  out/minikube-linux-arm64 profile list -l
functional_test.go:1344: Took "59.851317ms" to run "out/minikube-linux-arm64 profile list -l"
--- PASS: TestFunctional/parallel/ProfileCmd/profile_list (0.45s)

TestFunctional/parallel/ProfileCmd/profile_json_output (0.43s)

=== RUN   TestFunctional/parallel/ProfileCmd/profile_json_output
functional_test.go:1376: (dbg) Run:  out/minikube-linux-arm64 profile list -o json
functional_test.go:1381: Took "375.599881ms" to run "out/minikube-linux-arm64 profile list -o json"
functional_test.go:1389: (dbg) Run:  out/minikube-linux-arm64 profile list -o json --light
functional_test.go:1394: Took "52.886898ms" to run "out/minikube-linux-arm64 profile list -o json --light"
--- PASS: TestFunctional/parallel/ProfileCmd/profile_json_output (0.43s)

TestFunctional/parallel/MountCmd/any-port (7.39s)

=== RUN   TestFunctional/parallel/MountCmd/any-port
functional_test_mount_test.go:73: (dbg) daemon: [out/minikube-linux-arm64 mount -p functional-941011 /tmp/TestFunctionalparallelMountCmdany-port600351307/001:/mount-9p --alsologtostderr -v=1]
functional_test_mount_test.go:107: wrote "test-1763974919752046340" to /tmp/TestFunctionalparallelMountCmdany-port600351307/001/created-by-test
functional_test_mount_test.go:107: wrote "test-1763974919752046340" to /tmp/TestFunctionalparallelMountCmdany-port600351307/001/created-by-test-removed-by-pod
functional_test_mount_test.go:107: wrote "test-1763974919752046340" to /tmp/TestFunctionalparallelMountCmdany-port600351307/001/test-1763974919752046340
functional_test_mount_test.go:115: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:129: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 ssh -- ls -la /mount-9p
functional_test_mount_test.go:133: guest mount directory contents
total 2
-rw-r--r-- 1 docker docker 24 Nov 24 09:01 created-by-test
-rw-r--r-- 1 docker docker 24 Nov 24 09:01 created-by-test-removed-by-pod
-rw-r--r-- 1 docker docker 24 Nov 24 09:01 test-1763974919752046340
functional_test_mount_test.go:137: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 ssh cat /mount-9p/test-1763974919752046340
functional_test_mount_test.go:148: (dbg) Run:  kubectl --context functional-941011 replace --force -f testdata/busybox-mount-test.yaml
functional_test_mount_test.go:153: (dbg) TestFunctional/parallel/MountCmd/any-port: waiting 4m0s for pods matching "integration-test=busybox-mount" in namespace "default" ...
helpers_test.go:352: "busybox-mount" [bbdc1ec1-1c55-4e58-9c40-0340bcf98470] Pending
helpers_test.go:352: "busybox-mount" [bbdc1ec1-1c55-4e58-9c40-0340bcf98470] Pending / Ready:ContainersNotReady (containers with unready status: [mount-munger]) / ContainersReady:ContainersNotReady (containers with unready status: [mount-munger])
helpers_test.go:352: "busybox-mount" [bbdc1ec1-1c55-4e58-9c40-0340bcf98470] Pending / Initialized:PodCompleted / Ready:PodCompleted / ContainersReady:PodCompleted
helpers_test.go:352: "busybox-mount" [bbdc1ec1-1c55-4e58-9c40-0340bcf98470] Succeeded / Initialized:PodCompleted / Ready:PodCompleted / ContainersReady:PodCompleted
functional_test_mount_test.go:153: (dbg) TestFunctional/parallel/MountCmd/any-port: integration-test=busybox-mount healthy within 5.003819791s
functional_test_mount_test.go:169: (dbg) Run:  kubectl --context functional-941011 logs busybox-mount
functional_test_mount_test.go:181: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 ssh stat /mount-9p/created-by-test
functional_test_mount_test.go:181: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 ssh stat /mount-9p/created-by-pod
functional_test_mount_test.go:90: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 ssh "sudo umount -f /mount-9p"
functional_test_mount_test.go:94: (dbg) stopping [out/minikube-linux-arm64 mount -p functional-941011 /tmp/TestFunctionalparallelMountCmdany-port600351307/001:/mount-9p --alsologtostderr -v=1] ...
--- PASS: TestFunctional/parallel/MountCmd/any-port (7.39s)

TestFunctional/parallel/MountCmd/specific-port (1.99s)

=== RUN   TestFunctional/parallel/MountCmd/specific-port
functional_test_mount_test.go:213: (dbg) daemon: [out/minikube-linux-arm64 mount -p functional-941011 /tmp/TestFunctionalparallelMountCmdspecific-port1627151296/001:/mount-9p --alsologtostderr -v=1 --port 46464]
functional_test_mount_test.go:243: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:243: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-941011 ssh "findmnt -T /mount-9p | grep 9p": exit status 1 (339.803904ms)

** stderr ** 
	ssh: Process exited with status 1

** /stderr **
I1124 09:02:07.486174 1654467 retry.go:31] will retry after 582.791577ms: exit status 1
functional_test_mount_test.go:243: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:257: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 ssh -- ls -la /mount-9p
functional_test_mount_test.go:261: guest mount directory contents
total 0
functional_test_mount_test.go:263: (dbg) stopping [out/minikube-linux-arm64 mount -p functional-941011 /tmp/TestFunctionalparallelMountCmdspecific-port1627151296/001:/mount-9p --alsologtostderr -v=1 --port 46464] ...
functional_test_mount_test.go:264: reading mount text
functional_test_mount_test.go:278: done reading mount text
functional_test_mount_test.go:230: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 ssh "sudo umount -f /mount-9p"
functional_test_mount_test.go:230: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-941011 ssh "sudo umount -f /mount-9p": exit status 1 (289.927977ms)

-- stdout --
	umount: /mount-9p: not mounted.

-- /stdout --
** stderr ** 
	ssh: Process exited with status 32

** /stderr **
functional_test_mount_test.go:232: "out/minikube-linux-arm64 -p functional-941011 ssh \"sudo umount -f /mount-9p\"": exit status 1
functional_test_mount_test.go:234: (dbg) stopping [out/minikube-linux-arm64 mount -p functional-941011 /tmp/TestFunctionalparallelMountCmdspecific-port1627151296/001:/mount-9p --alsologtostderr -v=1 --port 46464] ...
--- PASS: TestFunctional/parallel/MountCmd/specific-port (1.99s)

TestFunctional/parallel/MountCmd/VerifyCleanup (1.3s)

=== RUN   TestFunctional/parallel/MountCmd/VerifyCleanup
functional_test_mount_test.go:298: (dbg) daemon: [out/minikube-linux-arm64 mount -p functional-941011 /tmp/TestFunctionalparallelMountCmdVerifyCleanup1964745822/001:/mount1 --alsologtostderr -v=1]
functional_test_mount_test.go:298: (dbg) daemon: [out/minikube-linux-arm64 mount -p functional-941011 /tmp/TestFunctionalparallelMountCmdVerifyCleanup1964745822/001:/mount2 --alsologtostderr -v=1]
functional_test_mount_test.go:298: (dbg) daemon: [out/minikube-linux-arm64 mount -p functional-941011 /tmp/TestFunctionalparallelMountCmdVerifyCleanup1964745822/001:/mount3 --alsologtostderr -v=1]
functional_test_mount_test.go:325: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 ssh "findmnt -T" /mount1
functional_test_mount_test.go:325: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 ssh "findmnt -T" /mount2
functional_test_mount_test.go:325: (dbg) Run:  out/minikube-linux-arm64 -p functional-941011 ssh "findmnt -T" /mount3
functional_test_mount_test.go:370: (dbg) Run:  out/minikube-linux-arm64 mount -p functional-941011 --kill=true
functional_test_mount_test.go:313: (dbg) stopping [out/minikube-linux-arm64 mount -p functional-941011 /tmp/TestFunctionalparallelMountCmdVerifyCleanup1964745822/001:/mount1 --alsologtostderr -v=1] ...
helpers_test.go:507: unable to find parent, assuming dead: process does not exist
functional_test_mount_test.go:313: (dbg) stopping [out/minikube-linux-arm64 mount -p functional-941011 /tmp/TestFunctionalparallelMountCmdVerifyCleanup1964745822/001:/mount2 --alsologtostderr -v=1] ...
helpers_test.go:507: unable to find parent, assuming dead: process does not exist
functional_test_mount_test.go:313: (dbg) stopping [out/minikube-linux-arm64 mount -p functional-941011 /tmp/TestFunctionalparallelMountCmdVerifyCleanup1964745822/001:/mount3 --alsologtostderr -v=1] ...
helpers_test.go:507: unable to find parent, assuming dead: process does not exist
--- PASS: TestFunctional/parallel/MountCmd/VerifyCleanup (1.30s)

TestFunctional/delete_echo-server_images (0.04s)

=== RUN   TestFunctional/delete_echo-server_images
functional_test.go:205: (dbg) Run:  docker rmi -f kicbase/echo-server:1.0
functional_test.go:205: (dbg) Run:  docker rmi -f kicbase/echo-server:functional-941011
--- PASS: TestFunctional/delete_echo-server_images (0.04s)

TestFunctional/delete_my-image_image (0.02s)

=== RUN   TestFunctional/delete_my-image_image
functional_test.go:213: (dbg) Run:  docker rmi -f localhost/my-image:functional-941011
--- PASS: TestFunctional/delete_my-image_image (0.02s)

TestFunctional/delete_minikube_cached_images (0.02s)

=== RUN   TestFunctional/delete_minikube_cached_images
functional_test.go:221: (dbg) Run:  docker rmi -f minikube-local-cache-test:functional-941011
--- PASS: TestFunctional/delete_minikube_cached_images (0.02s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CopySyncFile (0s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CopySyncFile
functional_test.go:1860: local sync path: /home/jenkins/minikube-integration/21978-1652607/.minikube/files/etc/test/nested/copy/1654467/hosts
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CopySyncFile (0.00s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/AuditLog (0s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/AuditLog
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/AuditLog (0.00s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/KubeContext (0.06s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/KubeContext
functional_test.go:696: (dbg) Run:  kubectl config current-context
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/KubeContext (0.06s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/add_remote (3.31s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/add_remote
functional_test.go:1064: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 cache add registry.k8s.io/pause:3.1
functional_test.go:1064: (dbg) Done: out/minikube-linux-arm64 -p functional-291288 cache add registry.k8s.io/pause:3.1: (1.143225794s)
functional_test.go:1064: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 cache add registry.k8s.io/pause:3.3
functional_test.go:1064: (dbg) Done: out/minikube-linux-arm64 -p functional-291288 cache add registry.k8s.io/pause:3.3: (1.121634012s)
functional_test.go:1064: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 cache add registry.k8s.io/pause:latest
functional_test.go:1064: (dbg) Done: out/minikube-linux-arm64 -p functional-291288 cache add registry.k8s.io/pause:latest: (1.041655478s)
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/add_remote (3.31s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/add_local (1.06s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/add_local
functional_test.go:1092: (dbg) Run:  docker build -t minikube-local-cache-test:functional-291288 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0serialCach3511359584/001
functional_test.go:1104: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 cache add minikube-local-cache-test:functional-291288
functional_test.go:1109: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 cache delete minikube-local-cache-test:functional-291288
functional_test.go:1098: (dbg) Run:  docker rmi minikube-local-cache-test:functional-291288
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/add_local (1.06s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/CacheDelete (0.06s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/CacheDelete
functional_test.go:1117: (dbg) Run:  out/minikube-linux-arm64 cache delete registry.k8s.io/pause:3.3
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/CacheDelete (0.06s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/list (0.06s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/list
functional_test.go:1125: (dbg) Run:  out/minikube-linux-arm64 cache list
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/list (0.06s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/verify_cache_inside_node (0.32s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/verify_cache_inside_node
functional_test.go:1139: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 ssh sudo crictl images
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/verify_cache_inside_node (0.32s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/cache_reload (1.85s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/cache_reload
functional_test.go:1162: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 ssh sudo crictl rmi registry.k8s.io/pause:latest
functional_test.go:1168: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 ssh sudo crictl inspecti registry.k8s.io/pause:latest
functional_test.go:1168: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-291288 ssh sudo crictl inspecti registry.k8s.io/pause:latest: exit status 1 (290.492349ms)

-- stdout --
	FATA[0000] no such image "registry.k8s.io/pause:latest" present 

-- /stdout --
** stderr ** 
	ssh: Process exited with status 1

** /stderr **
functional_test.go:1173: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 cache reload
functional_test.go:1178: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 ssh sudo crictl inspecti registry.k8s.io/pause:latest
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/cache_reload (1.85s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/delete (0.12s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/delete
functional_test.go:1187: (dbg) Run:  out/minikube-linux-arm64 cache delete registry.k8s.io/pause:3.1
functional_test.go:1187: (dbg) Run:  out/minikube-linux-arm64 cache delete registry.k8s.io/pause:latest
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/delete (0.12s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/LogsCmd (0.97s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/LogsCmd
functional_test.go:1251: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 logs
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/LogsCmd (0.97s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/LogsFileCmd (1.03s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/LogsFileCmd
functional_test.go:1265: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 logs --file /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0serialLogs2213327218/001/logs.txt
functional_test.go:1265: (dbg) Done: out/minikube-linux-arm64 -p functional-291288 logs --file /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0serialLogs2213327218/001/logs.txt: (1.027991612s)
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/LogsFileCmd (1.03s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ConfigCmd (0.45s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ConfigCmd
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ConfigCmd
=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ConfigCmd
functional_test.go:1214: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 config unset cpus
functional_test.go:1214: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 config get cpus
functional_test.go:1214: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-291288 config get cpus: exit status 14 (71.464843ms)

** stderr ** 
	Error: specified key could not be found in config

** /stderr **
functional_test.go:1214: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 config set cpus 2
functional_test.go:1214: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 config get cpus
functional_test.go:1214: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 config unset cpus
functional_test.go:1214: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 config get cpus
functional_test.go:1214: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-291288 config get cpus: exit status 14 (60.526105ms)

** stderr ** 
	Error: specified key could not be found in config

** /stderr **
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ConfigCmd (0.45s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/DryRun (0.44s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/DryRun
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/DryRun
=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/DryRun
functional_test.go:989: (dbg) Run:  out/minikube-linux-arm64 start -p functional-291288 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-beta.0
functional_test.go:989: (dbg) Non-zero exit: out/minikube-linux-arm64 start -p functional-291288 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-beta.0: exit status 23 (191.491322ms)

-- stdout --
	* [functional-291288] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	  - MINIKUBE_LOCATION=21978
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/21978-1652607/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/21978-1652607/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-arm64
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the docker driver based on existing profile
	
	

-- /stdout --
** stderr ** 
	I1124 09:39:56.761556 1725783 out.go:360] Setting OutFile to fd 1 ...
	I1124 09:39:56.761702 1725783 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1124 09:39:56.761713 1725783 out.go:374] Setting ErrFile to fd 2...
	I1124 09:39:56.761746 1725783 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1124 09:39:56.762028 1725783 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21978-1652607/.minikube/bin
	I1124 09:39:56.762436 1725783 out.go:368] Setting JSON to false
	I1124 09:39:56.763350 1725783 start.go:133] hostinfo: {"hostname":"ip-172-31-30-239","uptime":30126,"bootTime":1763947071,"procs":159,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"92f46a7d-c249-4c12-924a-77f64874c910"}
	I1124 09:39:56.763419 1725783 start.go:143] virtualization:  
	I1124 09:39:56.766686 1725783 out.go:179] * [functional-291288] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1124 09:39:56.769681 1725783 out.go:179]   - MINIKUBE_LOCATION=21978
	I1124 09:39:56.769787 1725783 notify.go:221] Checking for updates...
	I1124 09:39:56.775264 1725783 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1124 09:39:56.778073 1725783 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/21978-1652607/kubeconfig
	I1124 09:39:56.780955 1725783 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/21978-1652607/.minikube
	I1124 09:39:56.783863 1725783 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1124 09:39:56.786644 1725783 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1124 09:39:56.790079 1725783 config.go:182] Loaded profile config "functional-291288": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1124 09:39:56.790695 1725783 driver.go:422] Setting default libvirt URI to qemu:///system
	I1124 09:39:56.818482 1725783 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1124 09:39:56.818670 1725783 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1124 09:39:56.886370 1725783 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-11-24 09:39:56.87644877 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214831104 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1124 09:39:56.886513 1725783 docker.go:319] overlay module found
	I1124 09:39:56.889598 1725783 out.go:179] * Using the docker driver based on existing profile
	I1124 09:39:56.892569 1725783 start.go:309] selected driver: docker
	I1124 09:39:56.892588 1725783 start.go:927] validating driver "docker" against &{Name:functional-291288 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-291288 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1124 09:39:56.892693 1725783 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1124 09:39:56.896314 1725783 out.go:203] 
	W1124 09:39:56.899241 1725783 out.go:285] X Exiting due to RSRC_INSUFFICIENT_REQ_MEMORY: Requested memory allocation 250MiB is less than the usable minimum of 1800MB
	X Exiting due to RSRC_INSUFFICIENT_REQ_MEMORY: Requested memory allocation 250MiB is less than the usable minimum of 1800MB
	I1124 09:39:56.902150 1725783 out.go:203] 

** /stderr **
functional_test.go:1006: (dbg) Run:  out/minikube-linux-arm64 start -p functional-291288 --dry-run --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-beta.0
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/DryRun (0.44s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/InternationalLanguage (0.22s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/InternationalLanguage
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/InternationalLanguage

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/InternationalLanguage
functional_test.go:1035: (dbg) Run:  out/minikube-linux-arm64 start -p functional-291288 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-beta.0
functional_test.go:1035: (dbg) Non-zero exit: out/minikube-linux-arm64 start -p functional-291288 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-beta.0: exit status 23 (215.680798ms)

-- stdout --
	* [functional-291288] minikube v1.37.0 sur Ubuntu 20.04 (arm64)
	  - MINIKUBE_LOCATION=21978
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/21978-1652607/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/21978-1652607/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-arm64
	  - MINIKUBE_FORCE_SYSTEMD=
	* Utilisation du pilote docker basé sur le profil existant
	
	

-- /stdout --
** stderr ** 
	I1124 09:39:56.549330 1725731 out.go:360] Setting OutFile to fd 1 ...
	I1124 09:39:56.549509 1725731 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1124 09:39:56.549544 1725731 out.go:374] Setting ErrFile to fd 2...
	I1124 09:39:56.549559 1725731 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1124 09:39:56.549940 1725731 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21978-1652607/.minikube/bin
	I1124 09:39:56.550368 1725731 out.go:368] Setting JSON to false
	I1124 09:39:56.551301 1725731 start.go:133] hostinfo: {"hostname":"ip-172-31-30-239","uptime":30126,"bootTime":1763947071,"procs":159,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"92f46a7d-c249-4c12-924a-77f64874c910"}
	I1124 09:39:56.551375 1725731 start.go:143] virtualization:  
	I1124 09:39:56.554840 1725731 out.go:179] * [functional-291288] minikube v1.37.0 sur Ubuntu 20.04 (arm64)
	I1124 09:39:56.557725 1725731 out.go:179]   - MINIKUBE_LOCATION=21978
	I1124 09:39:56.557815 1725731 notify.go:221] Checking for updates...
	I1124 09:39:56.563514 1725731 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1124 09:39:56.566360 1725731 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/21978-1652607/kubeconfig
	I1124 09:39:56.569105 1725731 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/21978-1652607/.minikube
	I1124 09:39:56.571909 1725731 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1124 09:39:56.574700 1725731 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1124 09:39:56.577852 1725731 config.go:182] Loaded profile config "functional-291288": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1124 09:39:56.578541 1725731 driver.go:422] Setting default libvirt URI to qemu:///system
	I1124 09:39:56.609289 1725731 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1124 09:39:56.609417 1725731 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1124 09:39:56.692179 1725731 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-11-24 09:39:56.680463963 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214831104 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1124 09:39:56.692311 1725731 docker.go:319] overlay module found
	I1124 09:39:56.697428 1725731 out.go:179] * Utilisation du pilote docker basé sur le profil existant
	I1124 09:39:56.700296 1725731 start.go:309] selected driver: docker
	I1124 09:39:56.700319 1725731 start.go:927] validating driver "docker" against &{Name:functional-291288 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1763789673-21948@sha256:bb10ebd3ca086eea12c038085866fb2f6cfa67385dcb830c4deb5e36ced6b53f Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-291288 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1124 09:39:56.700408 1725731 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1124 09:39:56.703872 1725731 out.go:203] 
	W1124 09:39:56.706785 1725731 out.go:285] X Fermeture en raison de RSRC_INSUFFICIENT_REQ_MEMORY : L'allocation de mémoire demandée 250 Mio est inférieure au minimum utilisable de 1800 Mo
	X Fermeture en raison de RSRC_INSUFFICIENT_REQ_MEMORY : L'allocation de mémoire demandée 250 Mio est inférieure au minimum utilisable de 1800 Mo
	I1124 09:39:56.709643 1725731 out.go:203] 

** /stderr **
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/InternationalLanguage (0.22s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/AddonsCmd (0.16s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/AddonsCmd
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/AddonsCmd

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/AddonsCmd
functional_test.go:1695: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 addons list
functional_test.go:1707: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 addons list -o json
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/AddonsCmd (0.16s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/SSHCmd (0.75s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/SSHCmd
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/SSHCmd

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/SSHCmd
functional_test.go:1730: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 ssh "echo hello"
functional_test.go:1747: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 ssh "cat /etc/hostname"
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/SSHCmd (0.75s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/CpCmd (2.17s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/CpCmd
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/CpCmd

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/CpCmd
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 cp testdata/cp-test.txt /home/docker/cp-test.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 ssh -n functional-291288 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 cp functional-291288:/home/docker/cp-test.txt /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelCp4225058004/001/cp-test.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 ssh -n functional-291288 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 cp testdata/cp-test.txt /tmp/does/not/exist/cp-test.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 ssh -n functional-291288 "sudo cat /tmp/does/not/exist/cp-test.txt"
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/CpCmd (2.17s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/FileSync (0.35s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/FileSync
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/FileSync

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/FileSync
functional_test.go:1934: Checking for existence of /etc/test/nested/copy/1654467/hosts within VM
functional_test.go:1936: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 ssh "sudo cat /etc/test/nested/copy/1654467/hosts"
functional_test.go:1941: file sync test content: Test file for checking file sync process
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/FileSync (0.35s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/CertSync (2.14s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/CertSync
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/CertSync

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/CertSync
functional_test.go:1977: Checking for existence of /etc/ssl/certs/1654467.pem within VM
functional_test.go:1978: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 ssh "sudo cat /etc/ssl/certs/1654467.pem"
functional_test.go:1977: Checking for existence of /usr/share/ca-certificates/1654467.pem within VM
functional_test.go:1978: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 ssh "sudo cat /usr/share/ca-certificates/1654467.pem"
functional_test.go:1977: Checking for existence of /etc/ssl/certs/51391683.0 within VM
functional_test.go:1978: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 ssh "sudo cat /etc/ssl/certs/51391683.0"
functional_test.go:2004: Checking for existence of /etc/ssl/certs/16544672.pem within VM
functional_test.go:2005: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 ssh "sudo cat /etc/ssl/certs/16544672.pem"
functional_test.go:2004: Checking for existence of /usr/share/ca-certificates/16544672.pem within VM
functional_test.go:2005: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 ssh "sudo cat /usr/share/ca-certificates/16544672.pem"
functional_test.go:2004: Checking for existence of /etc/ssl/certs/3ec20f2e.0 within VM
functional_test.go:2005: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 ssh "sudo cat /etc/ssl/certs/3ec20f2e.0"
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/CertSync (2.14s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/NonActiveRuntimeDisabled (0.73s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/NonActiveRuntimeDisabled
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/NonActiveRuntimeDisabled

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/NonActiveRuntimeDisabled
functional_test.go:2032: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 ssh "sudo systemctl is-active docker"
functional_test.go:2032: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-291288 ssh "sudo systemctl is-active docker": exit status 1 (341.226753ms)

-- stdout --
	inactive

-- /stdout --
** stderr ** 
	ssh: Process exited with status 3

** /stderr **
functional_test.go:2032: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 ssh "sudo systemctl is-active crio"
functional_test.go:2032: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-291288 ssh "sudo systemctl is-active crio": exit status 1 (392.246396ms)

-- stdout --
	inactive

-- /stdout --
** stderr ** 
	ssh: Process exited with status 3

** /stderr **
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/NonActiveRuntimeDisabled (0.73s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/License (0.35s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/License
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/License

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/License
functional_test.go:2293: (dbg) Run:  out/minikube-linux-arm64 license
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/License (0.35s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/Version/short (0.07s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/Version/short
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/Version/short

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/Version/short
functional_test.go:2261: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 version --short
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/Version/short (0.07s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/Version/components (0.49s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/Version/components
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/Version/components

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/Version/components
functional_test.go:2275: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 version -o=json --components
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/Version/components (0.49s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageListShort (0.53s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageListShort
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageListShort

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageListShort
functional_test.go:276: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 image ls --format short --alsologtostderr
functional_test.go:281: (dbg) Stdout: out/minikube-linux-arm64 -p functional-291288 image ls --format short --alsologtostderr:
registry.k8s.io/pause:latest
registry.k8s.io/pause:3.3
registry.k8s.io/pause:3.10.1
registry.k8s.io/pause:3.1
registry.k8s.io/kube-scheduler:v1.35.0-beta.0
registry.k8s.io/kube-proxy:v1.35.0-beta.0
registry.k8s.io/kube-controller-manager:v1.35.0-beta.0
registry.k8s.io/kube-apiserver:v1.35.0-beta.0
registry.k8s.io/etcd:3.6.5-0
registry.k8s.io/etcd:3.5.24-0
registry.k8s.io/coredns/coredns:v1.13.1
gcr.io/k8s-minikube/storage-provisioner:v5
docker.io/library/minikube-local-cache-test:functional-291288
docker.io/kicbase/echo-server:functional-291288
functional_test.go:284: (dbg) Stderr: out/minikube-linux-arm64 -p functional-291288 image ls --format short --alsologtostderr:
I1124 09:39:59.955439 1726505 out.go:360] Setting OutFile to fd 1 ...
I1124 09:39:59.955623 1726505 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1124 09:39:59.955636 1726505 out.go:374] Setting ErrFile to fd 2...
I1124 09:39:59.955644 1726505 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1124 09:39:59.955927 1726505 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21978-1652607/.minikube/bin
I1124 09:39:59.956616 1726505 config.go:182] Loaded profile config "functional-291288": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
I1124 09:39:59.956786 1726505 config.go:182] Loaded profile config "functional-291288": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
I1124 09:39:59.957342 1726505 cli_runner.go:164] Run: docker container inspect functional-291288 --format={{.State.Status}}
I1124 09:39:59.974415 1726505 ssh_runner.go:195] Run: systemctl --version
I1124 09:39:59.974500 1726505 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
I1124 09:39:59.993649 1726505 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34684 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-291288/id_rsa Username:docker}
I1124 09:40:00.256659 1726505 ssh_runner.go:195] Run: sudo crictl images --output json
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageListShort (0.53s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageListTable (0.23s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageListTable
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageListTable
=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageListTable
functional_test.go:276: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 image ls --format table --alsologtostderr
functional_test.go:281: (dbg) Stdout: out/minikube-linux-arm64 -p functional-291288 image ls --format table --alsologtostderr:
┌─────────────────────────────────────────────┬───────────────────┬───────────────┬────────┐
│                    IMAGE                    │        TAG        │   IMAGE ID    │  SIZE  │
├─────────────────────────────────────────────┼───────────────────┼───────────────┼────────┤
│ docker.io/kicbase/echo-server               │ functional-291288 │ sha256:ce2d2c │ 2.17MB │
│ gcr.io/k8s-minikube/storage-provisioner     │ v5                │ sha256:667491 │ 8.03MB │
│ localhost/my-image                          │ functional-291288 │ sha256:ecd640 │ 831kB  │
│ registry.k8s.io/etcd                        │ 3.5.24-0          │ sha256:121140 │ 21.9MB │
│ registry.k8s.io/pause                       │ 3.10.1            │ sha256:d7b100 │ 265kB  │
│ registry.k8s.io/coredns/coredns             │ v1.13.1           │ sha256:e08f4d │ 21.2MB │
│ registry.k8s.io/kube-controller-manager     │ v1.35.0-beta.0    │ sha256:68b5f7 │ 20.7MB │
│ registry.k8s.io/pause                       │ 3.3               │ sha256:3d1873 │ 249kB  │
│ registry.k8s.io/kube-apiserver              │ v1.35.0-beta.0    │ sha256:ccd634 │ 24.7MB │
│ registry.k8s.io/pause                       │ 3.1               │ sha256:8057e0 │ 262kB  │
│ docker.io/library/minikube-local-cache-test │ functional-291288 │ sha256:cee23f │ 991B   │
│ registry.k8s.io/etcd                        │ 3.6.5-0           │ sha256:2c5f0d │ 21.1MB │
│ registry.k8s.io/kube-proxy                  │ v1.35.0-beta.0    │ sha256:404c2e │ 22.4MB │
│ registry.k8s.io/kube-scheduler              │ v1.35.0-beta.0    │ sha256:163787 │ 15.4MB │
│ registry.k8s.io/pause                       │ latest            │ sha256:8cb209 │ 71.3kB │
└─────────────────────────────────────────────┴───────────────────┴───────────────┴────────┘
functional_test.go:284: (dbg) Stderr: out/minikube-linux-arm64 -p functional-291288 image ls --format table --alsologtostderr:
I1124 09:40:04.609796 1726897 out.go:360] Setting OutFile to fd 1 ...
I1124 09:40:04.609923 1726897 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1124 09:40:04.609932 1726897 out.go:374] Setting ErrFile to fd 2...
I1124 09:40:04.609937 1726897 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1124 09:40:04.610536 1726897 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21978-1652607/.minikube/bin
I1124 09:40:04.611904 1726897 config.go:182] Loaded profile config "functional-291288": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
I1124 09:40:04.612064 1726897 config.go:182] Loaded profile config "functional-291288": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
I1124 09:40:04.612775 1726897 cli_runner.go:164] Run: docker container inspect functional-291288 --format={{.State.Status}}
I1124 09:40:04.631105 1726897 ssh_runner.go:195] Run: systemctl --version
I1124 09:40:04.631167 1726897 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
I1124 09:40:04.648507 1726897 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34684 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-291288/id_rsa Username:docker}
I1124 09:40:04.753033 1726897 ssh_runner.go:195] Run: sudo crictl images --output json
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageListTable (0.23s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageListJson (0.24s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageListJson
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageListJson
=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageListJson
functional_test.go:276: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 image ls --format json --alsologtostderr
functional_test.go:281: (dbg) Stdout: out/minikube-linux-arm64 -p functional-291288 image ls --format json --alsologtostderr:
[{"id":"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd","repoDigests":[],"repoTags":["registry.k8s.io/pause:3.10.1"],"size":"265458"},{"id":"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17","repoDigests":[],"repoTags":["docker.io/kicbase/echo-server:functional-291288"],"size":"2173567"},{"id":"sha256:404c2e12861777b763b8feaa316d36680fc68ad308a8d2f6e55f1bb981cdd904","repoDigests":[],"repoTags":["registry.k8s.io/kube-proxy:v1.35.0-beta.0"],"size":"22428165"},{"id":"sha256:8057e0500773a37cde2cff041eb13ebd68c748419a2fbfd1dfb5bf38696cc8e5","repoDigests":[],"repoTags":["registry.k8s.io/pause:3.1"],"size":"262191"},{"id":"sha256:e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf","repoDigests":[],"repoTags":["registry.k8s.io/coredns/coredns:v1.13.1"],"size":"21166088"},{"id":"sha256:16378741539f1be9c6e347d127537d379a6592587b09b4eb47964cb5c43a409b","repoDigests":[],"repoTags":["registry.k8s.io/kube-scheduler:v1.35.0-beta.0"],"size":"15389290"},{"id":"sha256:cee23f3226e286bed41a2ff2b5fad8e6e395a2934880a19d2a902b07140a4221","repoDigests":[],"repoTags":["docker.io/library/minikube-local-cache-test:functional-291288"],"size":"991"},{"id":"sha256:ecd6403f78577a6f280ca2286cd284f80ed0beb4d5904124a567ceee20dd7903","repoDigests":[],"repoTags":["localhost/my-image:functional-291288"],"size":"830617"},{"id":"sha256:1211402d28f5813ed906916bfcdd0a7404c2f9048ef5bb54387a6745bc410eca","repoDigests":[],"repoTags":["registry.k8s.io/etcd:3.5.24-0"],"size":"21880804"},{"id":"sha256:2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42","repoDigests":["registry.k8s.io/etcd@sha256:042ef9c02799eb9303abf1aa99b09f09d94b8ee3ba0c2dd3f42dc4e1d3dce534"],"repoTags":["registry.k8s.io/etcd:3.6.5-0"],"size":"21136588"},{"id":"sha256:68b5f775f18769fcb77bd8474c80bda2050163b6c66f4551f352b7381b8ca5be","repoDigests":[],"repoTags":["registry.k8s.io/kube-controller-manager:v1.35.0-beta.0"],"size":"20658969"},{"id":"sha256:8cb2091f603e75187e2f6226c5901d12e00b1d1f778c6471ae4578e8a1c4724a","repoDigests":[],"repoTags":["registry.k8s.io/pause:latest"],"size":"71300"},{"id":"sha256:66749159455b3f08c8318fe0233122f54d0f5889f9c5fdfb73c3fd9d99895b51","repoDigests":[],"repoTags":["gcr.io/k8s-minikube/storage-provisioner:v5"],"size":"8032639"},{"id":"sha256:ccd634d9bcc36ac6235e9c86676cd3a02c06afc3788a25f1bbf39ca7d44585f4","repoDigests":[],"repoTags":["registry.k8s.io/kube-apiserver:v1.35.0-beta.0"],"size":"24676285"},{"id":"sha256:3d18732f8686cc3c878055d99a05fa80289502fa496b36b6a0fe0f77206a7300","repoDigests":[],"repoTags":["registry.k8s.io/pause:3.3"],"size":"249461"}]
functional_test.go:284: (dbg) Stderr: out/minikube-linux-arm64 -p functional-291288 image ls --format json --alsologtostderr:
I1124 09:40:04.367352 1726854 out.go:360] Setting OutFile to fd 1 ...
I1124 09:40:04.367494 1726854 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1124 09:40:04.367539 1726854 out.go:374] Setting ErrFile to fd 2...
I1124 09:40:04.367553 1726854 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1124 09:40:04.367880 1726854 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21978-1652607/.minikube/bin
I1124 09:40:04.368544 1726854 config.go:182] Loaded profile config "functional-291288": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
I1124 09:40:04.368715 1726854 config.go:182] Loaded profile config "functional-291288": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
I1124 09:40:04.369324 1726854 cli_runner.go:164] Run: docker container inspect functional-291288 --format={{.State.Status}}
I1124 09:40:04.399300 1726854 ssh_runner.go:195] Run: systemctl --version
I1124 09:40:04.399367 1726854 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
I1124 09:40:04.420696 1726854 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34684 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-291288/id_rsa Username:docker}
I1124 09:40:04.525371 1726854 ssh_runner.go:195] Run: sudo crictl images --output json
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageListJson (0.24s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageListYaml (0.3s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageListYaml
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageListYaml
=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageListYaml
functional_test.go:276: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 image ls --format yaml --alsologtostderr
functional_test.go:281: (dbg) Stdout: out/minikube-linux-arm64 -p functional-291288 image ls --format yaml --alsologtostderr:
- id: sha256:404c2e12861777b763b8feaa316d36680fc68ad308a8d2f6e55f1bb981cdd904
repoDigests: []
repoTags:
- registry.k8s.io/kube-proxy:v1.35.0-beta.0
size: "22428165"
- id: sha256:16378741539f1be9c6e347d127537d379a6592587b09b4eb47964cb5c43a409b
repoDigests: []
repoTags:
- registry.k8s.io/kube-scheduler:v1.35.0-beta.0
size: "15389290"
- id: sha256:8057e0500773a37cde2cff041eb13ebd68c748419a2fbfd1dfb5bf38696cc8e5
repoDigests: []
repoTags:
- registry.k8s.io/pause:3.1
size: "262191"
- id: sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd
repoDigests: []
repoTags:
- registry.k8s.io/pause:3.10.1
size: "265458"
- id: sha256:3d18732f8686cc3c878055d99a05fa80289502fa496b36b6a0fe0f77206a7300
repoDigests: []
repoTags:
- registry.k8s.io/pause:3.3
size: "249461"
- id: sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17
repoDigests: []
repoTags:
- docker.io/kicbase/echo-server:functional-291288
size: "2173567"
- id: sha256:cee23f3226e286bed41a2ff2b5fad8e6e395a2934880a19d2a902b07140a4221
repoDigests: []
repoTags:
- docker.io/library/minikube-local-cache-test:functional-291288
size: "991"
- id: sha256:66749159455b3f08c8318fe0233122f54d0f5889f9c5fdfb73c3fd9d99895b51
repoDigests: []
repoTags:
- gcr.io/k8s-minikube/storage-provisioner:v5
size: "8032639"
- id: sha256:e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf
repoDigests: []
repoTags:
- registry.k8s.io/coredns/coredns:v1.13.1
size: "21166088"
- id: sha256:ccd634d9bcc36ac6235e9c86676cd3a02c06afc3788a25f1bbf39ca7d44585f4
repoDigests: []
repoTags:
- registry.k8s.io/kube-apiserver:v1.35.0-beta.0
size: "24676285"
- id: sha256:8cb2091f603e75187e2f6226c5901d12e00b1d1f778c6471ae4578e8a1c4724a
repoDigests: []
repoTags:
- registry.k8s.io/pause:latest
size: "71300"
- id: sha256:1211402d28f5813ed906916bfcdd0a7404c2f9048ef5bb54387a6745bc410eca
repoDigests: []
repoTags:
- registry.k8s.io/etcd:3.5.24-0
size: "21880804"
- id: sha256:2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42
repoDigests:
- registry.k8s.io/etcd@sha256:042ef9c02799eb9303abf1aa99b09f09d94b8ee3ba0c2dd3f42dc4e1d3dce534
repoTags:
- registry.k8s.io/etcd:3.6.5-0
size: "21136588"
- id: sha256:68b5f775f18769fcb77bd8474c80bda2050163b6c66f4551f352b7381b8ca5be
repoDigests: []
repoTags:
- registry.k8s.io/kube-controller-manager:v1.35.0-beta.0
size: "20658969"

functional_test.go:284: (dbg) Stderr: out/minikube-linux-arm64 -p functional-291288 image ls --format yaml --alsologtostderr:
I1124 09:40:00.524908 1726545 out.go:360] Setting OutFile to fd 1 ...
I1124 09:40:00.525181 1726545 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1124 09:40:00.525213 1726545 out.go:374] Setting ErrFile to fd 2...
I1124 09:40:00.525236 1726545 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1124 09:40:00.525583 1726545 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21978-1652607/.minikube/bin
I1124 09:40:00.526387 1726545 config.go:182] Loaded profile config "functional-291288": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
I1124 09:40:00.526655 1726545 config.go:182] Loaded profile config "functional-291288": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
I1124 09:40:00.527336 1726545 cli_runner.go:164] Run: docker container inspect functional-291288 --format={{.State.Status}}
I1124 09:40:00.564289 1726545 ssh_runner.go:195] Run: systemctl --version
I1124 09:40:00.564352 1726545 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
I1124 09:40:00.593598 1726545 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34684 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-291288/id_rsa Username:docker}
I1124 09:40:00.701775 1726545 ssh_runner.go:195] Run: sudo crictl images --output json
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageListYaml (0.30s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageBuild (3.58s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageBuild
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageBuild
=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageBuild
functional_test.go:323: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 ssh pgrep buildkitd
functional_test.go:323: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-291288 ssh pgrep buildkitd: exit status 1 (290.320124ms)

** stderr ** 
	ssh: Process exited with status 1

** /stderr **
functional_test.go:330: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 image build -t localhost/my-image:functional-291288 testdata/build --alsologtostderr
functional_test.go:330: (dbg) Done: out/minikube-linux-arm64 -p functional-291288 image build -t localhost/my-image:functional-291288 testdata/build --alsologtostderr: (3.063648111s)
functional_test.go:338: (dbg) Stderr: out/minikube-linux-arm64 -p functional-291288 image build -t localhost/my-image:functional-291288 testdata/build --alsologtostderr:
I1124 09:40:01.082767 1726646 out.go:360] Setting OutFile to fd 1 ...
I1124 09:40:01.082888 1726646 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1124 09:40:01.082901 1726646 out.go:374] Setting ErrFile to fd 2...
I1124 09:40:01.082906 1726646 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1124 09:40:01.083177 1726646 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21978-1652607/.minikube/bin
I1124 09:40:01.083828 1726646 config.go:182] Loaded profile config "functional-291288": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
I1124 09:40:01.084532 1726646 config.go:182] Loaded profile config "functional-291288": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
I1124 09:40:01.085108 1726646 cli_runner.go:164] Run: docker container inspect functional-291288 --format={{.State.Status}}
I1124 09:40:01.103144 1726646 ssh_runner.go:195] Run: systemctl --version
I1124 09:40:01.103216 1726646 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-291288
I1124 09:40:01.121858 1726646 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34684 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/functional-291288/id_rsa Username:docker}
I1124 09:40:01.229405 1726646 build_images.go:162] Building image from path: /tmp/build.1218485702.tar
I1124 09:40:01.229500 1726646 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/build
I1124 09:40:01.237832 1726646 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/build/build.1218485702.tar
I1124 09:40:01.241979 1726646 ssh_runner.go:352] existence check for /var/lib/minikube/build/build.1218485702.tar: stat -c "%s %y" /var/lib/minikube/build/build.1218485702.tar: Process exited with status 1
stdout:

stderr:
stat: cannot statx '/var/lib/minikube/build/build.1218485702.tar': No such file or directory
I1124 09:40:01.242016 1726646 ssh_runner.go:362] scp /tmp/build.1218485702.tar --> /var/lib/minikube/build/build.1218485702.tar (3072 bytes)
I1124 09:40:01.260767 1726646 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/build/build.1218485702
I1124 09:40:01.268892 1726646 ssh_runner.go:195] Run: sudo tar -C /var/lib/minikube/build/build.1218485702 -xf /var/lib/minikube/build/build.1218485702.tar
I1124 09:40:01.277435 1726646 containerd.go:394] Building image: /var/lib/minikube/build/build.1218485702
I1124 09:40:01.277534 1726646 ssh_runner.go:195] Run: sudo buildctl build --frontend dockerfile.v0 --local context=/var/lib/minikube/build/build.1218485702 --local dockerfile=/var/lib/minikube/build/build.1218485702 --output type=image,name=localhost/my-image:functional-291288
#1 [internal] load build definition from Dockerfile
#1 transferring dockerfile: 97B done
#1 DONE 0.0s

#2 [internal] load metadata for gcr.io/k8s-minikube/busybox:latest
#2 DONE 1.5s

#3 [internal] load .dockerignore
#3 transferring context: 2B done
#3 DONE 0.0s

#4 [internal] load build context
#4 transferring context: 62B done
#4 DONE 0.0s

#5 [1/3] FROM gcr.io/k8s-minikube/busybox:latest@sha256:ca5ae90100d50772da31f3b5016209e25ad61972404e2ccd83d44f10dee7e79b
#5 resolve gcr.io/k8s-minikube/busybox:latest@sha256:ca5ae90100d50772da31f3b5016209e25ad61972404e2ccd83d44f10dee7e79b 0.0s done
#5 DONE 0.1s

#5 [1/3] FROM gcr.io/k8s-minikube/busybox:latest@sha256:ca5ae90100d50772da31f3b5016209e25ad61972404e2ccd83d44f10dee7e79b
#5 sha256:a01966dde7f8d5ba10b6d87e776c7c8fb5a5f6bfa678874bd28b33b1fc6dba34 0B / 828.50kB 0.2s
#5 sha256:a01966dde7f8d5ba10b6d87e776c7c8fb5a5f6bfa678874bd28b33b1fc6dba34 828.50kB / 828.50kB 0.4s done
#5 extracting sha256:a01966dde7f8d5ba10b6d87e776c7c8fb5a5f6bfa678874bd28b33b1fc6dba34 0.1s done
#5 DONE 0.6s

#6 [2/3] RUN true
#6 DONE 0.2s

#7 [3/3] ADD content.txt /
#7 DONE 0.0s

#8 exporting to image
#8 exporting layers 0.1s done
#8 exporting manifest sha256:c0a42c835e2e146f3f08c80a520e25607967ac2cba086070c714b0b616cb614b
#8 exporting manifest sha256:c0a42c835e2e146f3f08c80a520e25607967ac2cba086070c714b0b616cb614b 0.0s done
#8 exporting config sha256:ecd6403f78577a6f280ca2286cd284f80ed0beb4d5904124a567ceee20dd7903 0.0s done
#8 naming to localhost/my-image:functional-291288 done
#8 DONE 0.2s
I1124 09:40:04.066033 1726646 ssh_runner.go:235] Completed: sudo buildctl build --frontend dockerfile.v0 --local context=/var/lib/minikube/build/build.1218485702 --local dockerfile=/var/lib/minikube/build/build.1218485702 --output type=image,name=localhost/my-image:functional-291288: (2.788458079s)
I1124 09:40:04.066120 1726646 ssh_runner.go:195] Run: sudo rm -rf /var/lib/minikube/build/build.1218485702
I1124 09:40:04.075177 1726646 ssh_runner.go:195] Run: sudo rm -f /var/lib/minikube/build/build.1218485702.tar
I1124 09:40:04.084472 1726646 build_images.go:218] Built localhost/my-image:functional-291288 from /tmp/build.1218485702.tar
I1124 09:40:04.084546 1726646 build_images.go:134] succeeded building to: functional-291288
I1124 09:40:04.084569 1726646 build_images.go:135] failed building to: 
functional_test.go:466: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 image ls
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageBuild (3.58s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/Setup (0.25s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/Setup
functional_test.go:357: (dbg) Run:  docker pull kicbase/echo-server:1.0
functional_test.go:362: (dbg) Run:  docker tag kicbase/echo-server:1.0 kicbase/echo-server:functional-291288
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/Setup (0.25s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageLoadDaemon (1.42s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageLoadDaemon
functional_test.go:370: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 image load --daemon kicbase/echo-server:functional-291288 --alsologtostderr
functional_test.go:370: (dbg) Done: out/minikube-linux-arm64 -p functional-291288 image load --daemon kicbase/echo-server:functional-291288 --alsologtostderr: (1.124848077s)
functional_test.go:466: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 image ls
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageLoadDaemon (1.42s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageReloadDaemon (1.3s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageReloadDaemon
functional_test.go:380: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 image load --daemon kicbase/echo-server:functional-291288 --alsologtostderr
functional_test.go:466: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 image ls
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageReloadDaemon (1.30s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageTagAndLoadDaemon (1.74s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageTagAndLoadDaemon
functional_test.go:250: (dbg) Run:  docker pull kicbase/echo-server:latest
functional_test.go:255: (dbg) Run:  docker tag kicbase/echo-server:latest kicbase/echo-server:functional-291288
functional_test.go:260: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 image load --daemon kicbase/echo-server:functional-291288 --alsologtostderr
functional_test.go:260: (dbg) Done: out/minikube-linux-arm64 -p functional-291288 image load --daemon kicbase/echo-server:functional-291288 --alsologtostderr: (1.174865356s)
functional_test.go:466: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 image ls
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageTagAndLoadDaemon (1.74s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/UpdateContextCmd/no_changes (0.15s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/UpdateContextCmd/no_changes
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/UpdateContextCmd/no_changes
=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/UpdateContextCmd/no_changes
functional_test.go:2124: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 update-context --alsologtostderr -v=2
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/UpdateContextCmd/no_changes (0.15s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/UpdateContextCmd/no_minikube_cluster (0.15s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/UpdateContextCmd/no_minikube_cluster
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/UpdateContextCmd/no_minikube_cluster
=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/UpdateContextCmd/no_minikube_cluster
functional_test.go:2124: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 update-context --alsologtostderr -v=2
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/UpdateContextCmd/no_minikube_cluster (0.15s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/UpdateContextCmd/no_clusters (0.16s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/UpdateContextCmd/no_clusters
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/UpdateContextCmd/no_clusters
=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/UpdateContextCmd/no_clusters
functional_test.go:2124: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 update-context --alsologtostderr -v=2
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/UpdateContextCmd/no_clusters (0.16s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageSaveToFile (0.47s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageSaveToFile
functional_test.go:395: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 image save kicbase/echo-server:functional-291288 /home/jenkins/workspace/Docker_Linux_containerd_arm64/echo-server-save.tar --alsologtostderr
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageSaveToFile (0.47s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageRemove (0.54s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageRemove
functional_test.go:407: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 image rm kicbase/echo-server:functional-291288 --alsologtostderr
functional_test.go:466: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 image ls
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageRemove (0.54s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageLoadFromFile (0.9s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageLoadFromFile
functional_test.go:424: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 image load /home/jenkins/workspace/Docker_Linux_containerd_arm64/echo-server-save.tar --alsologtostderr
functional_test.go:466: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 image ls
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageLoadFromFile (0.90s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageSaveDaemon (0.47s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageSaveDaemon
functional_test.go:434: (dbg) Run:  docker rmi kicbase/echo-server:functional-291288
functional_test.go:439: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 image save --daemon kicbase/echo-server:functional-291288 --alsologtostderr
functional_test.go:447: (dbg) Run:  docker image inspect kicbase/echo-server:functional-291288
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageSaveDaemon (0.47s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/StartTunnel (0s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/StartTunnel
functional_test_tunnel_test.go:129: (dbg) daemon: [out/minikube-linux-arm64 -p functional-291288 tunnel --alsologtostderr]
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/StartTunnel (0.00s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/DeleteTunnel (0.1s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/DeleteTunnel
functional_test_tunnel_test.go:434: (dbg) stopping [out/minikube-linux-arm64 -p functional-291288 tunnel --alsologtostderr] ...
functional_test_tunnel_test.go:437: failed to stop process: exit status 103
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/DeleteTunnel (0.10s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ProfileCmd/profile_not_create (0.44s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ProfileCmd/profile_not_create
functional_test.go:1285: (dbg) Run:  out/minikube-linux-arm64 profile lis
functional_test.go:1290: (dbg) Run:  out/minikube-linux-arm64 profile list --output json
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ProfileCmd/profile_not_create (0.44s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ProfileCmd/profile_list (0.4s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ProfileCmd/profile_list
functional_test.go:1325: (dbg) Run:  out/minikube-linux-arm64 profile list
functional_test.go:1330: Took "337.440223ms" to run "out/minikube-linux-arm64 profile list"
functional_test.go:1339: (dbg) Run:  out/minikube-linux-arm64 profile list -l
functional_test.go:1344: Took "58.733636ms" to run "out/minikube-linux-arm64 profile list -l"
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ProfileCmd/profile_list (0.40s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ProfileCmd/profile_json_output (0.44s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ProfileCmd/profile_json_output
functional_test.go:1376: (dbg) Run:  out/minikube-linux-arm64 profile list -o json
functional_test.go:1381: Took "375.057818ms" to run "out/minikube-linux-arm64 profile list -o json"
functional_test.go:1389: (dbg) Run:  out/minikube-linux-arm64 profile list -o json --light
functional_test.go:1394: Took "62.557847ms" to run "out/minikube-linux-arm64 profile list -o json --light"
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ProfileCmd/profile_json_output (0.44s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/MountCmd/specific-port (1.94s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/MountCmd/specific-port
functional_test_mount_test.go:213: (dbg) daemon: [out/minikube-linux-arm64 mount -p functional-291288 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo2133955644/001:/mount-9p --alsologtostderr -v=1 --port 46464]
functional_test_mount_test.go:243: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:243: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-291288 ssh "findmnt -T /mount-9p | grep 9p": exit status 1 (330.47737ms)

** stderr ** 
	ssh: Process exited with status 1

** /stderr **
I1124 09:39:49.813457 1654467 retry.go:31] will retry after 564.047381ms: exit status 1
functional_test_mount_test.go:243: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:257: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 ssh -- ls -la /mount-9p
functional_test_mount_test.go:261: guest mount directory contents
total 0
functional_test_mount_test.go:263: (dbg) stopping [out/minikube-linux-arm64 mount -p functional-291288 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo2133955644/001:/mount-9p --alsologtostderr -v=1 --port 46464] ...
functional_test_mount_test.go:264: reading mount text
functional_test_mount_test.go:278: done reading mount text
functional_test_mount_test.go:230: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 ssh "sudo umount -f /mount-9p"
functional_test_mount_test.go:230: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-291288 ssh "sudo umount -f /mount-9p": exit status 1 (272.128912ms)

-- stdout --
	umount: /mount-9p: not mounted.

-- /stdout --
** stderr ** 
	ssh: Process exited with status 32

** /stderr **
functional_test_mount_test.go:232: "out/minikube-linux-arm64 -p functional-291288 ssh \"sudo umount -f /mount-9p\"": exit status 1
functional_test_mount_test.go:234: (dbg) stopping [out/minikube-linux-arm64 mount -p functional-291288 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo2133955644/001:/mount-9p --alsologtostderr -v=1 --port 46464] ...
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/MountCmd/specific-port (1.94s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/MountCmd/VerifyCleanup (1.9s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/MountCmd/VerifyCleanup
functional_test_mount_test.go:298: (dbg) daemon: [out/minikube-linux-arm64 mount -p functional-291288 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo2066364349/001:/mount1 --alsologtostderr -v=1]
functional_test_mount_test.go:298: (dbg) daemon: [out/minikube-linux-arm64 mount -p functional-291288 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo2066364349/001:/mount2 --alsologtostderr -v=1]
functional_test_mount_test.go:298: (dbg) daemon: [out/minikube-linux-arm64 mount -p functional-291288 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo2066364349/001:/mount3 --alsologtostderr -v=1]
functional_test_mount_test.go:325: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 ssh "findmnt -T" /mount1
functional_test_mount_test.go:325: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-291288 ssh "findmnt -T" /mount1: exit status 1 (577.580178ms)

** stderr ** 
	ssh: Process exited with status 1

** /stderr **
I1124 09:39:52.005340 1654467 retry.go:31] will retry after 435.979268ms: exit status 1
functional_test_mount_test.go:325: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 ssh "findmnt -T" /mount1
functional_test_mount_test.go:325: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 ssh "findmnt -T" /mount2
functional_test_mount_test.go:325: (dbg) Run:  out/minikube-linux-arm64 -p functional-291288 ssh "findmnt -T" /mount3
functional_test_mount_test.go:370: (dbg) Run:  out/minikube-linux-arm64 mount -p functional-291288 --kill=true
functional_test_mount_test.go:313: (dbg) stopping [out/minikube-linux-arm64 mount -p functional-291288 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo2066364349/001:/mount1 --alsologtostderr -v=1] ...
helpers_test.go:507: unable to find parent, assuming dead: process does not exist
functional_test_mount_test.go:313: (dbg) stopping [out/minikube-linux-arm64 mount -p functional-291288 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo2066364349/001:/mount2 --alsologtostderr -v=1] ...
helpers_test.go:507: unable to find parent, assuming dead: process does not exist
functional_test_mount_test.go:313: (dbg) stopping [out/minikube-linux-arm64 mount -p functional-291288 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo2066364349/001:/mount3 --alsologtostderr -v=1] ...
helpers_test.go:507: unable to find parent, assuming dead: process does not exist
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/MountCmd/VerifyCleanup (1.90s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/delete_echo-server_images (0.04s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/delete_echo-server_images
functional_test.go:205: (dbg) Run:  docker rmi -f kicbase/echo-server:1.0
functional_test.go:205: (dbg) Run:  docker rmi -f kicbase/echo-server:functional-291288
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/delete_echo-server_images (0.04s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/delete_my-image_image (0.02s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/delete_my-image_image
functional_test.go:213: (dbg) Run:  docker rmi -f localhost/my-image:functional-291288
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/delete_my-image_image (0.02s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/delete_minikube_cached_images (0.02s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/delete_minikube_cached_images
functional_test.go:221: (dbg) Run:  docker rmi -f minikube-local-cache-test:functional-291288
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/delete_minikube_cached_images (0.02s)

TestMultiControlPlane/serial/StartCluster (170.6s)

=== RUN   TestMultiControlPlane/serial/StartCluster
ha_test.go:101: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 start --ha --memory 3072 --wait true --alsologtostderr -v 5 --driver=docker  --container-runtime=containerd
E1124 09:43:14.515680 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1124 09:43:14.522031 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1124 09:43:14.533298 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1124 09:43:14.554634 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1124 09:43:14.595993 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1124 09:43:14.677358 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1124 09:43:14.839012 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1124 09:43:15.160706 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1124 09:43:15.802733 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1124 09:43:17.084049 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1124 09:43:19.646619 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1124 09:43:24.767997 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1124 09:43:35.010213 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1124 09:43:55.492469 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1124 09:44:27.786617 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-941011/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1124 09:44:36.453759 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
ha_test.go:101: (dbg) Done: out/minikube-linux-arm64 -p ha-569070 start --ha --memory 3072 --wait true --alsologtostderr -v 5 --driver=docker  --container-runtime=containerd: (2m49.72796134s)
ha_test.go:107: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 status --alsologtostderr -v 5
--- PASS: TestMultiControlPlane/serial/StartCluster (170.60s)

TestMultiControlPlane/serial/DeployApp (8.2s)

=== RUN   TestMultiControlPlane/serial/DeployApp
ha_test.go:128: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 kubectl -- apply -f ./testdata/ha/ha-pod-dns-test.yaml
ha_test.go:133: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 kubectl -- rollout status deployment/busybox
ha_test.go:133: (dbg) Done: out/minikube-linux-arm64 -p ha-569070 kubectl -- rollout status deployment/busybox: (5.137212396s)
ha_test.go:140: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 kubectl -- get pods -o jsonpath='{.items[*].status.podIP}'
ha_test.go:163: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 kubectl -- get pods -o jsonpath='{.items[*].metadata.name}'
ha_test.go:171: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 kubectl -- exec busybox-7b57f96db7-9nzqq -- nslookup kubernetes.io
ha_test.go:171: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 kubectl -- exec busybox-7b57f96db7-hcwb6 -- nslookup kubernetes.io
ha_test.go:171: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 kubectl -- exec busybox-7b57f96db7-qdlmt -- nslookup kubernetes.io
ha_test.go:181: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 kubectl -- exec busybox-7b57f96db7-9nzqq -- nslookup kubernetes.default
ha_test.go:181: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 kubectl -- exec busybox-7b57f96db7-hcwb6 -- nslookup kubernetes.default
ha_test.go:181: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 kubectl -- exec busybox-7b57f96db7-qdlmt -- nslookup kubernetes.default
ha_test.go:189: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 kubectl -- exec busybox-7b57f96db7-9nzqq -- nslookup kubernetes.default.svc.cluster.local
ha_test.go:189: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 kubectl -- exec busybox-7b57f96db7-hcwb6 -- nslookup kubernetes.default.svc.cluster.local
ha_test.go:189: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 kubectl -- exec busybox-7b57f96db7-qdlmt -- nslookup kubernetes.default.svc.cluster.local
--- PASS: TestMultiControlPlane/serial/DeployApp (8.20s)

TestMultiControlPlane/serial/PingHostFromPods (1.61s)

=== RUN   TestMultiControlPlane/serial/PingHostFromPods
ha_test.go:199: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 kubectl -- get pods -o jsonpath='{.items[*].metadata.name}'
ha_test.go:207: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 kubectl -- exec busybox-7b57f96db7-9nzqq -- sh -c "nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3"
ha_test.go:218: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 kubectl -- exec busybox-7b57f96db7-9nzqq -- sh -c "ping -c 1 192.168.49.1"
ha_test.go:207: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 kubectl -- exec busybox-7b57f96db7-hcwb6 -- sh -c "nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3"
ha_test.go:218: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 kubectl -- exec busybox-7b57f96db7-hcwb6 -- sh -c "ping -c 1 192.168.49.1"
ha_test.go:207: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 kubectl -- exec busybox-7b57f96db7-qdlmt -- sh -c "nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3"
ha_test.go:218: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 kubectl -- exec busybox-7b57f96db7-qdlmt -- sh -c "ping -c 1 192.168.49.1"
--- PASS: TestMultiControlPlane/serial/PingHostFromPods (1.61s)

TestMultiControlPlane/serial/AddWorkerNode (60.54s)

=== RUN   TestMultiControlPlane/serial/AddWorkerNode
ha_test.go:228: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 node add --alsologtostderr -v 5
E1124 09:45:46.676352 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/addons-674149/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1124 09:45:58.376116 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1124 09:46:03.604824 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/addons-674149/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
ha_test.go:228: (dbg) Done: out/minikube-linux-arm64 -p ha-569070 node add --alsologtostderr -v 5: (59.411437329s)
ha_test.go:234: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 status --alsologtostderr -v 5
ha_test.go:234: (dbg) Done: out/minikube-linux-arm64 -p ha-569070 status --alsologtostderr -v 5: (1.124886398s)
--- PASS: TestMultiControlPlane/serial/AddWorkerNode (60.54s)

TestMultiControlPlane/serial/NodeLabels (0.11s)

=== RUN   TestMultiControlPlane/serial/NodeLabels
ha_test.go:255: (dbg) Run:  kubectl --context ha-569070 get nodes -o "jsonpath=[{range .items[*]}{.metadata.labels},{end}]"
--- PASS: TestMultiControlPlane/serial/NodeLabels (0.11s)

TestMultiControlPlane/serial/HAppyAfterClusterStart (1.11s)

=== RUN   TestMultiControlPlane/serial/HAppyAfterClusterStart
ha_test.go:281: (dbg) Run:  out/minikube-linux-arm64 profile list --output json
ha_test.go:281: (dbg) Done: out/minikube-linux-arm64 profile list --output json: (1.106773571s)
--- PASS: TestMultiControlPlane/serial/HAppyAfterClusterStart (1.11s)

TestMultiControlPlane/serial/CopyFile (20.51s)

=== RUN   TestMultiControlPlane/serial/CopyFile
ha_test.go:328: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 status --output json --alsologtostderr -v 5
ha_test.go:328: (dbg) Done: out/minikube-linux-arm64 -p ha-569070 status --output json --alsologtostderr -v 5: (1.047440576s)
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 cp testdata/cp-test.txt ha-569070:/home/docker/cp-test.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 ssh -n ha-569070 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 cp ha-569070:/home/docker/cp-test.txt /tmp/TestMultiControlPlaneserialCopyFile2560202434/001/cp-test_ha-569070.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 ssh -n ha-569070 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 cp ha-569070:/home/docker/cp-test.txt ha-569070-m02:/home/docker/cp-test_ha-569070_ha-569070-m02.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 ssh -n ha-569070 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 ssh -n ha-569070-m02 "sudo cat /home/docker/cp-test_ha-569070_ha-569070-m02.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 cp ha-569070:/home/docker/cp-test.txt ha-569070-m03:/home/docker/cp-test_ha-569070_ha-569070-m03.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 ssh -n ha-569070 "sudo cat /home/docker/cp-test.txt"
E1124 09:46:24.717216 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-941011/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 ssh -n ha-569070-m03 "sudo cat /home/docker/cp-test_ha-569070_ha-569070-m03.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 cp ha-569070:/home/docker/cp-test.txt ha-569070-m04:/home/docker/cp-test_ha-569070_ha-569070-m04.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 ssh -n ha-569070 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 ssh -n ha-569070-m04 "sudo cat /home/docker/cp-test_ha-569070_ha-569070-m04.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 cp testdata/cp-test.txt ha-569070-m02:/home/docker/cp-test.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 ssh -n ha-569070-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 cp ha-569070-m02:/home/docker/cp-test.txt /tmp/TestMultiControlPlaneserialCopyFile2560202434/001/cp-test_ha-569070-m02.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 ssh -n ha-569070-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 cp ha-569070-m02:/home/docker/cp-test.txt ha-569070:/home/docker/cp-test_ha-569070-m02_ha-569070.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 ssh -n ha-569070-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 ssh -n ha-569070 "sudo cat /home/docker/cp-test_ha-569070-m02_ha-569070.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 cp ha-569070-m02:/home/docker/cp-test.txt ha-569070-m03:/home/docker/cp-test_ha-569070-m02_ha-569070-m03.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 ssh -n ha-569070-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 ssh -n ha-569070-m03 "sudo cat /home/docker/cp-test_ha-569070-m02_ha-569070-m03.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 cp ha-569070-m02:/home/docker/cp-test.txt ha-569070-m04:/home/docker/cp-test_ha-569070-m02_ha-569070-m04.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 ssh -n ha-569070-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 ssh -n ha-569070-m04 "sudo cat /home/docker/cp-test_ha-569070-m02_ha-569070-m04.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 cp testdata/cp-test.txt ha-569070-m03:/home/docker/cp-test.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 ssh -n ha-569070-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 cp ha-569070-m03:/home/docker/cp-test.txt /tmp/TestMultiControlPlaneserialCopyFile2560202434/001/cp-test_ha-569070-m03.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 ssh -n ha-569070-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 cp ha-569070-m03:/home/docker/cp-test.txt ha-569070:/home/docker/cp-test_ha-569070-m03_ha-569070.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 ssh -n ha-569070-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 ssh -n ha-569070 "sudo cat /home/docker/cp-test_ha-569070-m03_ha-569070.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 cp ha-569070-m03:/home/docker/cp-test.txt ha-569070-m02:/home/docker/cp-test_ha-569070-m03_ha-569070-m02.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 ssh -n ha-569070-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 ssh -n ha-569070-m02 "sudo cat /home/docker/cp-test_ha-569070-m03_ha-569070-m02.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 cp ha-569070-m03:/home/docker/cp-test.txt ha-569070-m04:/home/docker/cp-test_ha-569070-m03_ha-569070-m04.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 ssh -n ha-569070-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 ssh -n ha-569070-m04 "sudo cat /home/docker/cp-test_ha-569070-m03_ha-569070-m04.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 cp testdata/cp-test.txt ha-569070-m04:/home/docker/cp-test.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 ssh -n ha-569070-m04 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 cp ha-569070-m04:/home/docker/cp-test.txt /tmp/TestMultiControlPlaneserialCopyFile2560202434/001/cp-test_ha-569070-m04.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 ssh -n ha-569070-m04 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 cp ha-569070-m04:/home/docker/cp-test.txt ha-569070:/home/docker/cp-test_ha-569070-m04_ha-569070.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 ssh -n ha-569070-m04 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 ssh -n ha-569070 "sudo cat /home/docker/cp-test_ha-569070-m04_ha-569070.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 cp ha-569070-m04:/home/docker/cp-test.txt ha-569070-m02:/home/docker/cp-test_ha-569070-m04_ha-569070-m02.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 ssh -n ha-569070-m04 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 ssh -n ha-569070-m02 "sudo cat /home/docker/cp-test_ha-569070-m04_ha-569070-m02.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 cp ha-569070-m04:/home/docker/cp-test.txt ha-569070-m03:/home/docker/cp-test_ha-569070-m04_ha-569070-m03.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 ssh -n ha-569070-m04 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 ssh -n ha-569070-m03 "sudo cat /home/docker/cp-test_ha-569070-m04_ha-569070-m03.txt"
--- PASS: TestMultiControlPlane/serial/CopyFile (20.51s)

TestMultiControlPlane/serial/StopSecondaryNode (13s)

=== RUN   TestMultiControlPlane/serial/StopSecondaryNode
ha_test.go:365: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 node stop m02 --alsologtostderr -v 5
ha_test.go:365: (dbg) Done: out/minikube-linux-arm64 -p ha-569070 node stop m02 --alsologtostderr -v 5: (12.239320336s)
ha_test.go:371: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 status --alsologtostderr -v 5
ha_test.go:371: (dbg) Non-zero exit: out/minikube-linux-arm64 -p ha-569070 status --alsologtostderr -v 5: exit status 7 (762.832339ms)
-- stdout --
	ha-569070
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-569070-m02
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	
	ha-569070-m03
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-569070-m04
	type: Worker
	host: Running
	kubelet: Running
	
-- /stdout --
** stderr ** 
	I1124 09:46:53.251941 1744464 out.go:360] Setting OutFile to fd 1 ...
	I1124 09:46:53.252168 1744464 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1124 09:46:53.252200 1744464 out.go:374] Setting ErrFile to fd 2...
	I1124 09:46:53.252221 1744464 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1124 09:46:53.252560 1744464 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21978-1652607/.minikube/bin
	I1124 09:46:53.252865 1744464 out.go:368] Setting JSON to false
	I1124 09:46:53.253089 1744464 mustload.go:66] Loading cluster: ha-569070
	I1124 09:46:53.253553 1744464 config.go:182] Loaded profile config "ha-569070": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
	I1124 09:46:53.253596 1744464 status.go:174] checking status of ha-569070 ...
	I1124 09:46:53.254212 1744464 cli_runner.go:164] Run: docker container inspect ha-569070 --format={{.State.Status}}
	I1124 09:46:53.254571 1744464 notify.go:221] Checking for updates...
	I1124 09:46:53.275685 1744464 status.go:371] ha-569070 host status = "Running" (err=<nil>)
	I1124 09:46:53.275708 1744464 host.go:66] Checking if "ha-569070" exists ...
	I1124 09:46:53.276016 1744464 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" ha-569070
	I1124 09:46:53.306572 1744464 host.go:66] Checking if "ha-569070" exists ...
	I1124 09:46:53.306881 1744464 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1124 09:46:53.306973 1744464 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" ha-569070
	I1124 09:46:53.327609 1744464 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34689 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/ha-569070/id_rsa Username:docker}
	I1124 09:46:53.440491 1744464 ssh_runner.go:195] Run: systemctl --version
	I1124 09:46:53.448013 1744464 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1124 09:46:53.463322 1744464 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1124 09:46:53.524880 1744464 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:4 ContainersRunning:3 ContainersPaused:0 ContainersStopped:1 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:69 OomKillDisable:true NGoroutines:72 SystemTime:2025-11-24 09:46:53.514977267 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214831104 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1124 09:46:53.525447 1744464 kubeconfig.go:125] found "ha-569070" server: "https://192.168.49.254:8443"
	I1124 09:46:53.525500 1744464 api_server.go:166] Checking apiserver status ...
	I1124 09:46:53.525550 1744464 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:46:53.537761 1744464 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/1419/cgroup
	I1124 09:46:53.546001 1744464 api_server.go:182] apiserver freezer: "11:freezer:/docker/e60cb8e8f0f77529a8e5723b24482eaf6cf62b4c664a2ee6eb3d86104c187ba9/kubepods/burstable/podd23e6749c23e3a04dd68738eeba45ecf/a0a068550eea853f53bb1bc803706bc0bdaee1c3fdee542aa72689a4f6fce34c"
	I1124 09:46:53.546092 1744464 ssh_runner.go:195] Run: sudo cat /sys/fs/cgroup/freezer/docker/e60cb8e8f0f77529a8e5723b24482eaf6cf62b4c664a2ee6eb3d86104c187ba9/kubepods/burstable/podd23e6749c23e3a04dd68738eeba45ecf/a0a068550eea853f53bb1bc803706bc0bdaee1c3fdee542aa72689a4f6fce34c/freezer.state
	I1124 09:46:53.554418 1744464 api_server.go:204] freezer state: "THAWED"
	I1124 09:46:53.554448 1744464 api_server.go:253] Checking apiserver healthz at https://192.168.49.254:8443/healthz ...
	I1124 09:46:53.563038 1744464 api_server.go:279] https://192.168.49.254:8443/healthz returned 200:
	ok
	I1124 09:46:53.563070 1744464 status.go:463] ha-569070 apiserver status = Running (err=<nil>)
	I1124 09:46:53.563083 1744464 status.go:176] ha-569070 status: &{Name:ha-569070 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I1124 09:46:53.563100 1744464 status.go:174] checking status of ha-569070-m02 ...
	I1124 09:46:53.563414 1744464 cli_runner.go:164] Run: docker container inspect ha-569070-m02 --format={{.State.Status}}
	I1124 09:46:53.580574 1744464 status.go:371] ha-569070-m02 host status = "Stopped" (err=<nil>)
	I1124 09:46:53.580600 1744464 status.go:384] host is not running, skipping remaining checks
	I1124 09:46:53.580608 1744464 status.go:176] ha-569070-m02 status: &{Name:ha-569070-m02 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I1124 09:46:53.580628 1744464 status.go:174] checking status of ha-569070-m03 ...
	I1124 09:46:53.580947 1744464 cli_runner.go:164] Run: docker container inspect ha-569070-m03 --format={{.State.Status}}
	I1124 09:46:53.597795 1744464 status.go:371] ha-569070-m03 host status = "Running" (err=<nil>)
	I1124 09:46:53.597821 1744464 host.go:66] Checking if "ha-569070-m03" exists ...
	I1124 09:46:53.598132 1744464 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" ha-569070-m03
	I1124 09:46:53.615235 1744464 host.go:66] Checking if "ha-569070-m03" exists ...
	I1124 09:46:53.615542 1744464 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1124 09:46:53.615593 1744464 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" ha-569070-m03
	I1124 09:46:53.632684 1744464 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34699 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/ha-569070-m03/id_rsa Username:docker}
	I1124 09:46:53.735828 1744464 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1124 09:46:53.749142 1744464 kubeconfig.go:125] found "ha-569070" server: "https://192.168.49.254:8443"
	I1124 09:46:53.749173 1744464 api_server.go:166] Checking apiserver status ...
	I1124 09:46:53.749237 1744464 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 09:46:53.761581 1744464 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/1346/cgroup
	I1124 09:46:53.770107 1744464 api_server.go:182] apiserver freezer: "11:freezer:/docker/5ab2bb82ddfb34069bdb220aaec3dd2f8b3977d171b3bad31fac5a191ac419de/kubepods/burstable/poda6f0fb21befa3583856495437d7643f0/7901f3eceef3705f3968983bd31d078c74985557520f4573e5098d4d879fd52c"
	I1124 09:46:53.770222 1744464 ssh_runner.go:195] Run: sudo cat /sys/fs/cgroup/freezer/docker/5ab2bb82ddfb34069bdb220aaec3dd2f8b3977d171b3bad31fac5a191ac419de/kubepods/burstable/poda6f0fb21befa3583856495437d7643f0/7901f3eceef3705f3968983bd31d078c74985557520f4573e5098d4d879fd52c/freezer.state
	I1124 09:46:53.777817 1744464 api_server.go:204] freezer state: "THAWED"
	I1124 09:46:53.777848 1744464 api_server.go:253] Checking apiserver healthz at https://192.168.49.254:8443/healthz ...
	I1124 09:46:53.786231 1744464 api_server.go:279] https://192.168.49.254:8443/healthz returned 200:
	ok
	I1124 09:46:53.786278 1744464 status.go:463] ha-569070-m03 apiserver status = Running (err=<nil>)
	I1124 09:46:53.786289 1744464 status.go:176] ha-569070-m03 status: &{Name:ha-569070-m03 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I1124 09:46:53.786308 1744464 status.go:174] checking status of ha-569070-m04 ...
	I1124 09:46:53.786727 1744464 cli_runner.go:164] Run: docker container inspect ha-569070-m04 --format={{.State.Status}}
	I1124 09:46:53.803723 1744464 status.go:371] ha-569070-m04 host status = "Running" (err=<nil>)
	I1124 09:46:53.803748 1744464 host.go:66] Checking if "ha-569070-m04" exists ...
	I1124 09:46:53.804069 1744464 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" ha-569070-m04
	I1124 09:46:53.825015 1744464 host.go:66] Checking if "ha-569070-m04" exists ...
	I1124 09:46:53.825339 1744464 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1124 09:46:53.825384 1744464 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" ha-569070-m04
	I1124 09:46:53.842761 1744464 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34704 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/ha-569070-m04/id_rsa Username:docker}
	I1124 09:46:53.947666 1744464 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1124 09:46:53.961612 1744464 status.go:176] ha-569070-m04 status: &{Name:ha-569070-m04 Host:Running Kubelet:Running APIServer:Irrelevant Kubeconfig:Irrelevant Worker:true TimeToStop: DockerEnv: PodManEnv:}
** /stderr **
--- PASS: TestMultiControlPlane/serial/StopSecondaryNode (13.00s)

TestMultiControlPlane/serial/DegradedAfterControlPlaneNodeStop (0.85s)

=== RUN   TestMultiControlPlane/serial/DegradedAfterControlPlaneNodeStop
ha_test.go:392: (dbg) Run:  out/minikube-linux-arm64 profile list --output json
--- PASS: TestMultiControlPlane/serial/DegradedAfterControlPlaneNodeStop (0.85s)

TestMultiControlPlane/serial/RestartSecondaryNode (15.34s)

=== RUN   TestMultiControlPlane/serial/RestartSecondaryNode
ha_test.go:422: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 node start m02 --alsologtostderr -v 5
ha_test.go:422: (dbg) Done: out/minikube-linux-arm64 -p ha-569070 node start m02 --alsologtostderr -v 5: (14.110462947s)
ha_test.go:430: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 status --alsologtostderr -v 5
ha_test.go:430: (dbg) Done: out/minikube-linux-arm64 -p ha-569070 status --alsologtostderr -v 5: (1.085008704s)
ha_test.go:450: (dbg) Run:  kubectl get nodes
--- PASS: TestMultiControlPlane/serial/RestartSecondaryNode (15.34s)

TestMultiControlPlane/serial/HAppyAfterSecondaryNodeRestart (1.05s)

=== RUN   TestMultiControlPlane/serial/HAppyAfterSecondaryNodeRestart
ha_test.go:281: (dbg) Run:  out/minikube-linux-arm64 profile list --output json
ha_test.go:281: (dbg) Done: out/minikube-linux-arm64 profile list --output json: (1.045188649s)
--- PASS: TestMultiControlPlane/serial/HAppyAfterSecondaryNodeRestart (1.05s)

TestMultiControlPlane/serial/RestartClusterKeepsNodes (94.49s)

=== RUN   TestMultiControlPlane/serial/RestartClusterKeepsNodes
ha_test.go:458: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 node list --alsologtostderr -v 5
ha_test.go:464: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 stop --alsologtostderr -v 5
ha_test.go:464: (dbg) Done: out/minikube-linux-arm64 -p ha-569070 stop --alsologtostderr -v 5: (31.809945171s)
ha_test.go:469: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 start --wait true --alsologtostderr -v 5
E1124 09:48:14.515264 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1124 09:48:42.218975 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
ha_test.go:469: (dbg) Done: out/minikube-linux-arm64 -p ha-569070 start --wait true --alsologtostderr -v 5: (1m2.52812945s)
ha_test.go:474: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 node list --alsologtostderr -v 5
--- PASS: TestMultiControlPlane/serial/RestartClusterKeepsNodes (94.49s)

TestMultiControlPlane/serial/DeleteSecondaryNode (11.15s)

=== RUN   TestMultiControlPlane/serial/DeleteSecondaryNode
ha_test.go:489: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 node delete m03 --alsologtostderr -v 5
ha_test.go:489: (dbg) Done: out/minikube-linux-arm64 -p ha-569070 node delete m03 --alsologtostderr -v 5: (10.153100845s)
ha_test.go:495: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 status --alsologtostderr -v 5
ha_test.go:513: (dbg) Run:  kubectl get nodes
ha_test.go:521: (dbg) Run:  kubectl get nodes -o "go-template='{{range .items}}{{range .status.conditions}}{{if eq .type "Ready"}} {{.status}}{{"\n"}}{{end}}{{end}}{{end}}'"
--- PASS: TestMultiControlPlane/serial/DeleteSecondaryNode (11.15s)

TestMultiControlPlane/serial/DegradedAfterSecondaryNodeDelete (0.84s)

=== RUN   TestMultiControlPlane/serial/DegradedAfterSecondaryNodeDelete
ha_test.go:392: (dbg) Run:  out/minikube-linux-arm64 profile list --output json
--- PASS: TestMultiControlPlane/serial/DegradedAfterSecondaryNodeDelete (0.84s)

TestMultiControlPlane/serial/StopCluster (25.75s)

=== RUN   TestMultiControlPlane/serial/StopCluster
ha_test.go:533: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 stop --alsologtostderr -v 5
ha_test.go:533: (dbg) Done: out/minikube-linux-arm64 -p ha-569070 stop --alsologtostderr -v 5: (25.637260111s)
ha_test.go:539: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 status --alsologtostderr -v 5
ha_test.go:539: (dbg) Non-zero exit: out/minikube-linux-arm64 -p ha-569070 status --alsologtostderr -v 5: exit status 7 (116.806076ms)
-- stdout --
	ha-569070
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	
	ha-569070-m02
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	
	ha-569070-m04
	type: Worker
	host: Stopped
	kubelet: Stopped
	
-- /stdout --
** stderr ** 
	I1124 09:49:23.359709 1759163 out.go:360] Setting OutFile to fd 1 ...
	I1124 09:49:23.359894 1759163 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1124 09:49:23.359919 1759163 out.go:374] Setting ErrFile to fd 2...
	I1124 09:49:23.359939 1759163 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1124 09:49:23.360695 1759163 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21978-1652607/.minikube/bin
	I1124 09:49:23.360955 1759163 out.go:368] Setting JSON to false
	I1124 09:49:23.361018 1759163 mustload.go:66] Loading cluster: ha-569070
	I1124 09:49:23.361115 1759163 notify.go:221] Checking for updates...
	I1124 09:49:23.361472 1759163 config.go:182] Loaded profile config "ha-569070": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
	I1124 09:49:23.361518 1759163 status.go:174] checking status of ha-569070 ...
	I1124 09:49:23.362065 1759163 cli_runner.go:164] Run: docker container inspect ha-569070 --format={{.State.Status}}
	I1124 09:49:23.380727 1759163 status.go:371] ha-569070 host status = "Stopped" (err=<nil>)
	I1124 09:49:23.380747 1759163 status.go:384] host is not running, skipping remaining checks
	I1124 09:49:23.380759 1759163 status.go:176] ha-569070 status: &{Name:ha-569070 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I1124 09:49:23.380787 1759163 status.go:174] checking status of ha-569070-m02 ...
	I1124 09:49:23.381091 1759163 cli_runner.go:164] Run: docker container inspect ha-569070-m02 --format={{.State.Status}}
	I1124 09:49:23.403851 1759163 status.go:371] ha-569070-m02 host status = "Stopped" (err=<nil>)
	I1124 09:49:23.403875 1759163 status.go:384] host is not running, skipping remaining checks
	I1124 09:49:23.403882 1759163 status.go:176] ha-569070-m02 status: &{Name:ha-569070-m02 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I1124 09:49:23.403901 1759163 status.go:174] checking status of ha-569070-m04 ...
	I1124 09:49:23.404196 1759163 cli_runner.go:164] Run: docker container inspect ha-569070-m04 --format={{.State.Status}}
	I1124 09:49:23.425722 1759163 status.go:371] ha-569070-m04 host status = "Stopped" (err=<nil>)
	I1124 09:49:23.425743 1759163 status.go:384] host is not running, skipping remaining checks
	I1124 09:49:23.425750 1759163 status.go:176] ha-569070-m04 status: &{Name:ha-569070-m04 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:true TimeToStop: DockerEnv: PodManEnv:}
** /stderr **
--- PASS: TestMultiControlPlane/serial/StopCluster (25.75s)

TestMultiControlPlane/serial/RestartCluster (61.29s)

=== RUN   TestMultiControlPlane/serial/RestartCluster
ha_test.go:562: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 start --wait true --alsologtostderr -v 5 --driver=docker  --container-runtime=containerd
ha_test.go:562: (dbg) Done: out/minikube-linux-arm64 -p ha-569070 start --wait true --alsologtostderr -v 5 --driver=docker  --container-runtime=containerd: (1m0.309125429s)
ha_test.go:568: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 status --alsologtostderr -v 5
ha_test.go:586: (dbg) Run:  kubectl get nodes
ha_test.go:594: (dbg) Run:  kubectl get nodes -o "go-template='{{range .items}}{{range .status.conditions}}{{if eq .type "Ready"}} {{.status}}{{"\n"}}{{end}}{{end}}{{end}}'"
--- PASS: TestMultiControlPlane/serial/RestartCluster (61.29s)

TestMultiControlPlane/serial/DegradedAfterClusterRestart (0.84s)

=== RUN   TestMultiControlPlane/serial/DegradedAfterClusterRestart
ha_test.go:392: (dbg) Run:  out/minikube-linux-arm64 profile list --output json
--- PASS: TestMultiControlPlane/serial/DegradedAfterClusterRestart (0.84s)

TestMultiControlPlane/serial/AddSecondaryNode (83.83s)

=== RUN   TestMultiControlPlane/serial/AddSecondaryNode
ha_test.go:607: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 node add --control-plane --alsologtostderr -v 5
E1124 09:51:03.604477 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/addons-674149/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1124 09:51:24.717620 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-941011/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
ha_test.go:607: (dbg) Done: out/minikube-linux-arm64 -p ha-569070 node add --control-plane --alsologtostderr -v 5: (1m22.732734261s)
ha_test.go:613: (dbg) Run:  out/minikube-linux-arm64 -p ha-569070 status --alsologtostderr -v 5
ha_test.go:613: (dbg) Done: out/minikube-linux-arm64 -p ha-569070 status --alsologtostderr -v 5: (1.099752912s)
--- PASS: TestMultiControlPlane/serial/AddSecondaryNode (83.83s)

TestMultiControlPlane/serial/HAppyAfterSecondaryNodeAdd (1.12s)

=== RUN   TestMultiControlPlane/serial/HAppyAfterSecondaryNodeAdd
ha_test.go:281: (dbg) Run:  out/minikube-linux-arm64 profile list --output json
ha_test.go:281: (dbg) Done: out/minikube-linux-arm64 profile list --output json: (1.124336549s)
--- PASS: TestMultiControlPlane/serial/HAppyAfterSecondaryNodeAdd (1.12s)

TestJSONOutput/start/Command (83.65s)

=== RUN   TestJSONOutput/start/Command
json_output_test.go:63: (dbg) Run:  out/minikube-linux-arm64 start -p json-output-190042 --output=json --user=testUser --memory=3072 --wait=true --driver=docker  --container-runtime=containerd
E1124 09:53:14.515121 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
json_output_test.go:63: (dbg) Done: out/minikube-linux-arm64 start -p json-output-190042 --output=json --user=testUser --memory=3072 --wait=true --driver=docker  --container-runtime=containerd: (1m23.643274052s)
--- PASS: TestJSONOutput/start/Command (83.65s)

TestJSONOutput/start/Audit (0s)

=== RUN   TestJSONOutput/start/Audit
--- PASS: TestJSONOutput/start/Audit (0.00s)

TestJSONOutput/start/parallel/DistinctCurrentSteps (0s)

=== RUN   TestJSONOutput/start/parallel/DistinctCurrentSteps
=== PAUSE TestJSONOutput/start/parallel/DistinctCurrentSteps
=== CONT  TestJSONOutput/start/parallel/DistinctCurrentSteps
--- PASS: TestJSONOutput/start/parallel/DistinctCurrentSteps (0.00s)

TestJSONOutput/start/parallel/IncreasingCurrentSteps (0s)

=== RUN   TestJSONOutput/start/parallel/IncreasingCurrentSteps
=== PAUSE TestJSONOutput/start/parallel/IncreasingCurrentSteps
=== CONT  TestJSONOutput/start/parallel/IncreasingCurrentSteps
--- PASS: TestJSONOutput/start/parallel/IncreasingCurrentSteps (0.00s)

TestJSONOutput/pause/Command (0.77s)

=== RUN   TestJSONOutput/pause/Command
json_output_test.go:63: (dbg) Run:  out/minikube-linux-arm64 pause -p json-output-190042 --output=json --user=testUser
--- PASS: TestJSONOutput/pause/Command (0.77s)

TestJSONOutput/pause/Audit (0s)

=== RUN   TestJSONOutput/pause/Audit
--- PASS: TestJSONOutput/pause/Audit (0.00s)

TestJSONOutput/pause/parallel/DistinctCurrentSteps (0s)

=== RUN   TestJSONOutput/pause/parallel/DistinctCurrentSteps
=== PAUSE TestJSONOutput/pause/parallel/DistinctCurrentSteps
=== CONT  TestJSONOutput/pause/parallel/DistinctCurrentSteps
--- PASS: TestJSONOutput/pause/parallel/DistinctCurrentSteps (0.00s)

TestJSONOutput/pause/parallel/IncreasingCurrentSteps (0s)

=== RUN   TestJSONOutput/pause/parallel/IncreasingCurrentSteps
=== PAUSE TestJSONOutput/pause/parallel/IncreasingCurrentSteps
=== CONT  TestJSONOutput/pause/parallel/IncreasingCurrentSteps
--- PASS: TestJSONOutput/pause/parallel/IncreasingCurrentSteps (0.00s)

TestJSONOutput/unpause/Command (0.61s)

=== RUN   TestJSONOutput/unpause/Command
json_output_test.go:63: (dbg) Run:  out/minikube-linux-arm64 unpause -p json-output-190042 --output=json --user=testUser
--- PASS: TestJSONOutput/unpause/Command (0.61s)

TestJSONOutput/unpause/Audit (0s)

=== RUN   TestJSONOutput/unpause/Audit
--- PASS: TestJSONOutput/unpause/Audit (0.00s)

TestJSONOutput/unpause/parallel/DistinctCurrentSteps (0s)

=== RUN   TestJSONOutput/unpause/parallel/DistinctCurrentSteps
=== PAUSE TestJSONOutput/unpause/parallel/DistinctCurrentSteps
=== CONT  TestJSONOutput/unpause/parallel/DistinctCurrentSteps
--- PASS: TestJSONOutput/unpause/parallel/DistinctCurrentSteps (0.00s)

TestJSONOutput/unpause/parallel/IncreasingCurrentSteps (0s)
=== RUN   TestJSONOutput/unpause/parallel/IncreasingCurrentSteps
=== PAUSE TestJSONOutput/unpause/parallel/IncreasingCurrentSteps
=== CONT  TestJSONOutput/unpause/parallel/IncreasingCurrentSteps
--- PASS: TestJSONOutput/unpause/parallel/IncreasingCurrentSteps (0.00s)

TestJSONOutput/stop/Command (5.99s)
=== RUN   TestJSONOutput/stop/Command
json_output_test.go:63: (dbg) Run:  out/minikube-linux-arm64 stop -p json-output-190042 --output=json --user=testUser
json_output_test.go:63: (dbg) Done: out/minikube-linux-arm64 stop -p json-output-190042 --output=json --user=testUser: (5.99229081s)
--- PASS: TestJSONOutput/stop/Command (5.99s)

TestJSONOutput/stop/Audit (0s)
=== RUN   TestJSONOutput/stop/Audit
--- PASS: TestJSONOutput/stop/Audit (0.00s)

TestJSONOutput/stop/parallel/DistinctCurrentSteps (0s)
=== RUN   TestJSONOutput/stop/parallel/DistinctCurrentSteps
=== PAUSE TestJSONOutput/stop/parallel/DistinctCurrentSteps
=== CONT  TestJSONOutput/stop/parallel/DistinctCurrentSteps
--- PASS: TestJSONOutput/stop/parallel/DistinctCurrentSteps (0.00s)

TestJSONOutput/stop/parallel/IncreasingCurrentSteps (0s)
=== RUN   TestJSONOutput/stop/parallel/IncreasingCurrentSteps
=== PAUSE TestJSONOutput/stop/parallel/IncreasingCurrentSteps
=== CONT  TestJSONOutput/stop/parallel/IncreasingCurrentSteps
--- PASS: TestJSONOutput/stop/parallel/IncreasingCurrentSteps (0.00s)

TestErrorJSONOutput (0.25s)
=== RUN   TestErrorJSONOutput
json_output_test.go:160: (dbg) Run:  out/minikube-linux-arm64 start -p json-output-error-898209 --memory=3072 --output=json --wait=true --driver=fail
json_output_test.go:160: (dbg) Non-zero exit: out/minikube-linux-arm64 start -p json-output-error-898209 --memory=3072 --output=json --wait=true --driver=fail: exit status 56 (100.492968ms)
-- stdout --
	{"specversion":"1.0","id":"60245c1d-abcc-4ec2-8b8d-157f93ae4f12","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.step","datacontenttype":"application/json","data":{"currentstep":"0","message":"[json-output-error-898209] minikube v1.37.0 on Ubuntu 20.04 (arm64)","name":"Initial Minikube Setup","totalsteps":"19"}}
	{"specversion":"1.0","id":"df00aebb-2d1d-4ddf-81bd-2611105c5f2c","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_LOCATION=21978"}}
	{"specversion":"1.0","id":"684aeea2-de4e-46b8-92a0-f90fabc21176","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true"}}
	{"specversion":"1.0","id":"d961af72-a8e7-4cd2-9742-6934c148680f","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"KUBECONFIG=/home/jenkins/minikube-integration/21978-1652607/kubeconfig"}}
	{"specversion":"1.0","id":"d205c95f-8e26-4f39-bb44-b5e1200826d9","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_HOME=/home/jenkins/minikube-integration/21978-1652607/.minikube"}}
	{"specversion":"1.0","id":"bc470629-3e2f-42ca-90b0-749c4facc6a5","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_BIN=out/minikube-linux-arm64"}}
	{"specversion":"1.0","id":"f9baa7b6-0bf7-485a-85ac-044508acb18b","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_FORCE_SYSTEMD="}}
	{"specversion":"1.0","id":"63d40b13-c586-4280-9bc6-26cb334dc1e7","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.error","datacontenttype":"application/json","data":{"advice":"","exitcode":"56","issues":"","message":"The driver 'fail' is not supported on linux/arm64","name":"DRV_UNSUPPORTED_OS","url":""}}
-- /stdout --
helpers_test.go:175: Cleaning up "json-output-error-898209" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-arm64 delete -p json-output-error-898209
--- PASS: TestErrorJSONOutput (0.25s)
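The stdout above is minikube's `--output=json` stream: one CloudEvents JSON object per line, with error events typed `io.k8s.sigs.minikube.error`. A minimal sketch of pulling the error message out of such a stream with plain grep/sed (no jq); the sample lines below are abridged from the output above, not re-captured:

```shell
# Sketch: extract the message of io.k8s.sigs.minikube.error events from a
# minikube --output=json stream. One JSON object per line, as shown above.
# Sample events are abridged from the log; field layout is the same.
events='{"specversion":"1.0","type":"io.k8s.sigs.minikube.info","data":{"message":"MINIKUBE_LOCATION=21978"}}
{"specversion":"1.0","type":"io.k8s.sigs.minikube.error","data":{"exitcode":"56","message":"The driver fail is not supported on linux/arm64","name":"DRV_UNSUPPORTED_OS"}}'

# Keep only error-typed events, then capture the "message" field value.
err_msg=$(printf '%s\n' "$events" \
  | grep 'io.k8s.sigs.minikube.error' \
  | sed -n 's/.*"message":"\([^"]*\)".*/\1/p')
echo "$err_msg"
```

This is only robust for the flat, one-object-per-line shape minikube emits; for anything nested or multi-line, a real JSON parser (jq) is the safer tool.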

TestKicCustomNetwork/create_custom_network (41.57s)
=== RUN   TestKicCustomNetwork/create_custom_network
kic_custom_network_test.go:57: (dbg) Run:  out/minikube-linux-arm64 start -p docker-network-952541 --network=
kic_custom_network_test.go:57: (dbg) Done: out/minikube-linux-arm64 start -p docker-network-952541 --network=: (39.316356929s)
kic_custom_network_test.go:150: (dbg) Run:  docker network ls --format {{.Name}}
helpers_test.go:175: Cleaning up "docker-network-952541" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-arm64 delete -p docker-network-952541
helpers_test.go:178: (dbg) Done: out/minikube-linux-arm64 delete -p docker-network-952541: (2.232596847s)
--- PASS: TestKicCustomNetwork/create_custom_network (41.57s)

TestKicCustomNetwork/use_default_bridge_network (35.99s)
=== RUN   TestKicCustomNetwork/use_default_bridge_network
kic_custom_network_test.go:57: (dbg) Run:  out/minikube-linux-arm64 start -p docker-network-380810 --network=bridge
kic_custom_network_test.go:57: (dbg) Done: out/minikube-linux-arm64 start -p docker-network-380810 --network=bridge: (33.859797112s)
kic_custom_network_test.go:150: (dbg) Run:  docker network ls --format {{.Name}}
helpers_test.go:175: Cleaning up "docker-network-380810" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-arm64 delete -p docker-network-380810
helpers_test.go:178: (dbg) Done: out/minikube-linux-arm64 delete -p docker-network-380810: (2.104906331s)
--- PASS: TestKicCustomNetwork/use_default_bridge_network (35.99s)

TestKicExistingNetwork (38.85s)
=== RUN   TestKicExistingNetwork
I1124 09:54:52.954884 1654467 cli_runner.go:164] Run: docker network inspect existing-network --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
W1124 09:54:52.971554 1654467 cli_runner.go:211] docker network inspect existing-network --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}" returned with exit code 1
I1124 09:54:52.971640 1654467 network_create.go:284] running [docker network inspect existing-network] to gather additional debugging logs...
I1124 09:54:52.971656 1654467 cli_runner.go:164] Run: docker network inspect existing-network
W1124 09:54:52.999118 1654467 cli_runner.go:211] docker network inspect existing-network returned with exit code 1
I1124 09:54:52.999149 1654467 network_create.go:287] error running [docker network inspect existing-network]: docker network inspect existing-network: exit status 1
stdout:
[]
stderr:
Error response from daemon: network existing-network not found
I1124 09:54:52.999166 1654467 network_create.go:289] output of [docker network inspect existing-network]: -- stdout --
[]
-- /stdout --
** stderr ** 
Error response from daemon: network existing-network not found
** /stderr **
I1124 09:54:52.999282 1654467 cli_runner.go:164] Run: docker network inspect bridge --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
I1124 09:54:53.019158 1654467 network.go:211] skipping subnet 192.168.49.0/24 that is taken: &{IP:192.168.49.0 Netmask:255.255.255.0 Prefix:24 CIDR:192.168.49.0/24 Gateway:192.168.49.1 ClientMin:192.168.49.2 ClientMax:192.168.49.254 Broadcast:192.168.49.255 IsPrivate:true Interface:{IfaceName:br-150d0a6dddcd IfaceIPv4:192.168.49.1 IfaceMTU:1500 IfaceMAC:e2:5f:fe:86:6f:c9} reservation:<nil>}
I1124 09:54:53.019497 1654467 network.go:206] using free private subnet 192.168.58.0/24: &{IP:192.168.58.0 Netmask:255.255.255.0 Prefix:24 CIDR:192.168.58.0/24 Gateway:192.168.58.1 ClientMin:192.168.58.2 ClientMax:192.168.58.254 Broadcast:192.168.58.255 IsPrivate:true Interface:{IfaceName: IfaceIPv4: IfaceMTU:0 IfaceMAC:} reservation:0x4001f232a0}
I1124 09:54:53.019521 1654467 network_create.go:124] attempt to create docker network existing-network 192.168.58.0/24 with gateway 192.168.58.1 and MTU of 1500 ...
I1124 09:54:53.019575 1654467 cli_runner.go:164] Run: docker network create --driver=bridge --subnet=192.168.58.0/24 --gateway=192.168.58.1 -o --ip-masq -o --icc -o com.docker.network.driver.mtu=1500 --label=created_by.minikube.sigs.k8s.io=true --label=name.minikube.sigs.k8s.io=existing-network existing-network
I1124 09:54:53.078783 1654467 network_create.go:108] docker network existing-network 192.168.58.0/24 created
kic_custom_network_test.go:150: (dbg) Run:  docker network ls --format {{.Name}}
kic_custom_network_test.go:93: (dbg) Run:  out/minikube-linux-arm64 start -p existing-network-809702 --network=existing-network
kic_custom_network_test.go:93: (dbg) Done: out/minikube-linux-arm64 start -p existing-network-809702 --network=existing-network: (36.499240852s)
helpers_test.go:175: Cleaning up "existing-network-809702" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-arm64 delete -p existing-network-809702
helpers_test.go:178: (dbg) Done: out/minikube-linux-arm64 delete -p existing-network-809702: (2.189647715s)
I1124 09:55:31.784632 1654467 cli_runner.go:164] Run: docker network ls --filter=label=existing-network --format {{.Name}}
--- PASS: TestKicExistingNetwork (38.85s)
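The network_create lines above show minikube skipping 192.168.49.0/24 (already taken by an existing bridge) and settling on 192.168.58.0/24 before running `docker network create`. A rough shell sketch of that first-free scan; the candidate list and the step of 9 between third octets are inferred from the subnets seen in this log (49, 58, 67) and are illustrative, not minikube's actual implementation:

```shell
# Hypothetical sketch of the free-subnet scan seen above: step through
# candidate 192.168.X.0/24 blocks and pick the first one not already in
# use. "taken" would come from `docker network inspect` in practice.
taken="192.168.49.0/24"              # subnet already claimed by a bridge
free=""
for third in 49 58 67 76 85; do      # illustrative candidates, step of 9
  cand="192.168.${third}.0/24"
  case " $taken " in
    *" $cand "*) continue ;;         # subnet in use, keep scanning
  esac
  free="$cand"
  break
done
echo "$free"                         # first unclaimed candidate
```

With 192.168.49.0/24 taken, the scan lands on 192.168.58.0/24, matching the "using free private subnet" line in the log.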

TestKicCustomSubnet (37.65s)
=== RUN   TestKicCustomSubnet
kic_custom_network_test.go:112: (dbg) Run:  out/minikube-linux-arm64 start -p custom-subnet-077771 --subnet=192.168.60.0/24
E1124 09:56:03.605071 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/addons-674149/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
kic_custom_network_test.go:112: (dbg) Done: out/minikube-linux-arm64 start -p custom-subnet-077771 --subnet=192.168.60.0/24: (35.316161598s)
kic_custom_network_test.go:161: (dbg) Run:  docker network inspect custom-subnet-077771 --format "{{(index .IPAM.Config 0).Subnet}}"
helpers_test.go:175: Cleaning up "custom-subnet-077771" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-arm64 delete -p custom-subnet-077771
helpers_test.go:178: (dbg) Done: out/minikube-linux-arm64 delete -p custom-subnet-077771: (2.305555713s)
--- PASS: TestKicCustomSubnet (37.65s)

TestKicStaticIP (39.09s)
=== RUN   TestKicStaticIP
kic_custom_network_test.go:132: (dbg) Run:  out/minikube-linux-arm64 start -p static-ip-318213 --static-ip=192.168.200.200
E1124 09:56:24.716989 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-941011/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
kic_custom_network_test.go:132: (dbg) Done: out/minikube-linux-arm64 start -p static-ip-318213 --static-ip=192.168.200.200: (36.734639898s)
kic_custom_network_test.go:138: (dbg) Run:  out/minikube-linux-arm64 -p static-ip-318213 ip
helpers_test.go:175: Cleaning up "static-ip-318213" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-arm64 delete -p static-ip-318213
helpers_test.go:178: (dbg) Done: out/minikube-linux-arm64 delete -p static-ip-318213: (2.195581859s)
--- PASS: TestKicStaticIP (39.09s)

TestMainNoArgs (0.05s)
=== RUN   TestMainNoArgs
main_test.go:70: (dbg) Run:  out/minikube-linux-arm64
--- PASS: TestMainNoArgs (0.05s)

TestMinikubeProfile (72.54s)
=== RUN   TestMinikubeProfile
minikube_profile_test.go:44: (dbg) Run:  out/minikube-linux-arm64 start -p first-726093 --driver=docker  --container-runtime=containerd
minikube_profile_test.go:44: (dbg) Done: out/minikube-linux-arm64 start -p first-726093 --driver=docker  --container-runtime=containerd: (31.171056744s)
minikube_profile_test.go:44: (dbg) Run:  out/minikube-linux-arm64 start -p second-729071 --driver=docker  --container-runtime=containerd
minikube_profile_test.go:44: (dbg) Done: out/minikube-linux-arm64 start -p second-729071 --driver=docker  --container-runtime=containerd: (35.083371434s)
minikube_profile_test.go:51: (dbg) Run:  out/minikube-linux-arm64 profile first-726093
minikube_profile_test.go:55: (dbg) Run:  out/minikube-linux-arm64 profile list -ojson
minikube_profile_test.go:51: (dbg) Run:  out/minikube-linux-arm64 profile second-729071
minikube_profile_test.go:55: (dbg) Run:  out/minikube-linux-arm64 profile list -ojson
helpers_test.go:175: Cleaning up "second-729071" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-arm64 delete -p second-729071
helpers_test.go:178: (dbg) Done: out/minikube-linux-arm64 delete -p second-729071: (2.143219233s)
helpers_test.go:175: Cleaning up "first-726093" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-arm64 delete -p first-726093
helpers_test.go:178: (dbg) Done: out/minikube-linux-arm64 delete -p first-726093: (2.676572729s)
--- PASS: TestMinikubeProfile (72.54s)

TestMountStart/serial/StartWithMountFirst (8.3s)
=== RUN   TestMountStart/serial/StartWithMountFirst
mount_start_test.go:118: (dbg) Run:  out/minikube-linux-arm64 start -p mount-start-1-498014 --memory=3072 --mount-string /tmp/TestMountStartserial681140800/001:/minikube-host --mount-gid 0 --mount-msize 6543 --mount-port 46464 --mount-uid 0 --no-kubernetes --driver=docker  --container-runtime=containerd
mount_start_test.go:118: (dbg) Done: out/minikube-linux-arm64 start -p mount-start-1-498014 --memory=3072 --mount-string /tmp/TestMountStartserial681140800/001:/minikube-host --mount-gid 0 --mount-msize 6543 --mount-port 46464 --mount-uid 0 --no-kubernetes --driver=docker  --container-runtime=containerd: (7.298623645s)
--- PASS: TestMountStart/serial/StartWithMountFirst (8.30s)

TestMountStart/serial/VerifyMountFirst (0.27s)
=== RUN   TestMountStart/serial/VerifyMountFirst
mount_start_test.go:134: (dbg) Run:  out/minikube-linux-arm64 -p mount-start-1-498014 ssh -- ls /minikube-host
--- PASS: TestMountStart/serial/VerifyMountFirst (0.27s)

TestMountStart/serial/StartWithMountSecond (8.43s)
=== RUN   TestMountStart/serial/StartWithMountSecond
mount_start_test.go:118: (dbg) Run:  out/minikube-linux-arm64 start -p mount-start-2-499901 --memory=3072 --mount-string /tmp/TestMountStartserial681140800/001:/minikube-host --mount-gid 0 --mount-msize 6543 --mount-port 46465 --mount-uid 0 --no-kubernetes --driver=docker  --container-runtime=containerd
E1124 09:58:14.515673 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
mount_start_test.go:118: (dbg) Done: out/minikube-linux-arm64 start -p mount-start-2-499901 --memory=3072 --mount-string /tmp/TestMountStartserial681140800/001:/minikube-host --mount-gid 0 --mount-msize 6543 --mount-port 46465 --mount-uid 0 --no-kubernetes --driver=docker  --container-runtime=containerd: (7.427896879s)
--- PASS: TestMountStart/serial/StartWithMountSecond (8.43s)

TestMountStart/serial/VerifyMountSecond (0.28s)
=== RUN   TestMountStart/serial/VerifyMountSecond
mount_start_test.go:134: (dbg) Run:  out/minikube-linux-arm64 -p mount-start-2-499901 ssh -- ls /minikube-host
--- PASS: TestMountStart/serial/VerifyMountSecond (0.28s)

TestMountStart/serial/DeleteFirst (1.73s)
=== RUN   TestMountStart/serial/DeleteFirst
pause_test.go:132: (dbg) Run:  out/minikube-linux-arm64 delete -p mount-start-1-498014 --alsologtostderr -v=5
pause_test.go:132: (dbg) Done: out/minikube-linux-arm64 delete -p mount-start-1-498014 --alsologtostderr -v=5: (1.72752439s)
--- PASS: TestMountStart/serial/DeleteFirst (1.73s)

TestMountStart/serial/VerifyMountPostDelete (0.28s)
=== RUN   TestMountStart/serial/VerifyMountPostDelete
mount_start_test.go:134: (dbg) Run:  out/minikube-linux-arm64 -p mount-start-2-499901 ssh -- ls /minikube-host
--- PASS: TestMountStart/serial/VerifyMountPostDelete (0.28s)

TestMountStart/serial/Stop (1.29s)
=== RUN   TestMountStart/serial/Stop
mount_start_test.go:196: (dbg) Run:  out/minikube-linux-arm64 stop -p mount-start-2-499901
mount_start_test.go:196: (dbg) Done: out/minikube-linux-arm64 stop -p mount-start-2-499901: (1.292031808s)
--- PASS: TestMountStart/serial/Stop (1.29s)

TestMountStart/serial/RestartStopped (7.33s)
=== RUN   TestMountStart/serial/RestartStopped
mount_start_test.go:207: (dbg) Run:  out/minikube-linux-arm64 start -p mount-start-2-499901
mount_start_test.go:207: (dbg) Done: out/minikube-linux-arm64 start -p mount-start-2-499901: (6.331754462s)
--- PASS: TestMountStart/serial/RestartStopped (7.33s)

TestMountStart/serial/VerifyMountPostStop (0.28s)
=== RUN   TestMountStart/serial/VerifyMountPostStop
mount_start_test.go:134: (dbg) Run:  out/minikube-linux-arm64 -p mount-start-2-499901 ssh -- ls /minikube-host
--- PASS: TestMountStart/serial/VerifyMountPostStop (0.28s)

TestMultiNode/serial/FreshStart2Nodes (111.65s)
=== RUN   TestMultiNode/serial/FreshStart2Nodes
multinode_test.go:96: (dbg) Run:  out/minikube-linux-arm64 start -p multinode-309580 --wait=true --memory=3072 --nodes=2 -v=5 --alsologtostderr --driver=docker  --container-runtime=containerd
E1124 09:59:37.583783 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
multinode_test.go:96: (dbg) Done: out/minikube-linux-arm64 start -p multinode-309580 --wait=true --memory=3072 --nodes=2 -v=5 --alsologtostderr --driver=docker  --container-runtime=containerd: (1m51.089130088s)
multinode_test.go:102: (dbg) Run:  out/minikube-linux-arm64 -p multinode-309580 status --alsologtostderr
--- PASS: TestMultiNode/serial/FreshStart2Nodes (111.65s)

TestMultiNode/serial/DeployApp2Nodes (4.82s)
=== RUN   TestMultiNode/serial/DeployApp2Nodes
multinode_test.go:493: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-309580 -- apply -f ./testdata/multinodes/multinode-pod-dns-test.yaml
multinode_test.go:498: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-309580 -- rollout status deployment/busybox
multinode_test.go:498: (dbg) Done: out/minikube-linux-arm64 kubectl -p multinode-309580 -- rollout status deployment/busybox: (2.962681371s)
multinode_test.go:505: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-309580 -- get pods -o jsonpath='{.items[*].status.podIP}'
multinode_test.go:528: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-309580 -- get pods -o jsonpath='{.items[*].metadata.name}'
multinode_test.go:536: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-309580 -- exec busybox-7b57f96db7-6zvv6 -- nslookup kubernetes.io
multinode_test.go:536: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-309580 -- exec busybox-7b57f96db7-b67nf -- nslookup kubernetes.io
multinode_test.go:546: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-309580 -- exec busybox-7b57f96db7-6zvv6 -- nslookup kubernetes.default
multinode_test.go:546: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-309580 -- exec busybox-7b57f96db7-b67nf -- nslookup kubernetes.default
multinode_test.go:554: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-309580 -- exec busybox-7b57f96db7-6zvv6 -- nslookup kubernetes.default.svc.cluster.local
multinode_test.go:554: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-309580 -- exec busybox-7b57f96db7-b67nf -- nslookup kubernetes.default.svc.cluster.local
--- PASS: TestMultiNode/serial/DeployApp2Nodes (4.82s)

TestMultiNode/serial/PingHostFrom2Pods (0.96s)
=== RUN   TestMultiNode/serial/PingHostFrom2Pods
multinode_test.go:564: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-309580 -- get pods -o jsonpath='{.items[*].metadata.name}'
multinode_test.go:572: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-309580 -- exec busybox-7b57f96db7-6zvv6 -- sh -c "nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3"
multinode_test.go:583: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-309580 -- exec busybox-7b57f96db7-6zvv6 -- sh -c "ping -c 1 192.168.67.1"
multinode_test.go:572: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-309580 -- exec busybox-7b57f96db7-b67nf -- sh -c "nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3"
multinode_test.go:583: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-309580 -- exec busybox-7b57f96db7-b67nf -- sh -c "ping -c 1 192.168.67.1"
--- PASS: TestMultiNode/serial/PingHostFrom2Pods (0.96s)
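The pipeline in the test above (`nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3`) relies on busybox nslookup putting the resolved address on line 5 of its output. A small sketch of that extraction against a canned sample; the sample output is illustrative of the busybox format, not captured from this run:

```shell
# Sketch of the extraction used above: take line 5 of nslookup output and
# grab the third space-separated field, which holds the resolved IP.
# Sample text is an illustrative busybox-style nslookup answer.
sample='Server:    10.96.0.10
Address:   10.96.0.10:53

Name:      host.minikube.internal
Address 1: 192.168.67.1'

host_ip=$(printf '%s\n' "$sample" | awk 'NR==5' | cut -d" " -f3)
echo "$host_ip"
```

The hard-coded `NR==5` is brittle: other nslookup implementations (or IPv6 answers) shift the line layout, which is why the test then confirms the value with a direct `ping -c 1` to the extracted address.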

TestMultiNode/serial/AddNode (27.69s)
=== RUN   TestMultiNode/serial/AddNode
multinode_test.go:121: (dbg) Run:  out/minikube-linux-arm64 node add -p multinode-309580 -v=5 --alsologtostderr
multinode_test.go:121: (dbg) Done: out/minikube-linux-arm64 node add -p multinode-309580 -v=5 --alsologtostderr: (26.942498834s)
multinode_test.go:127: (dbg) Run:  out/minikube-linux-arm64 -p multinode-309580 status --alsologtostderr
--- PASS: TestMultiNode/serial/AddNode (27.69s)

TestMultiNode/serial/MultiNodeLabels (0.09s)
=== RUN   TestMultiNode/serial/MultiNodeLabels
multinode_test.go:221: (dbg) Run:  kubectl --context multinode-309580 get nodes -o "jsonpath=[{range .items[*]}{.metadata.labels},{end}]"
--- PASS: TestMultiNode/serial/MultiNodeLabels (0.09s)

TestMultiNode/serial/ProfileList (0.73s)
=== RUN   TestMultiNode/serial/ProfileList
multinode_test.go:143: (dbg) Run:  out/minikube-linux-arm64 profile list --output json
--- PASS: TestMultiNode/serial/ProfileList (0.73s)

TestMultiNode/serial/CopyFile (10.88s)
=== RUN   TestMultiNode/serial/CopyFile
multinode_test.go:184: (dbg) Run:  out/minikube-linux-arm64 -p multinode-309580 status --output json --alsologtostderr
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p multinode-309580 cp testdata/cp-test.txt multinode-309580:/home/docker/cp-test.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p multinode-309580 ssh -n multinode-309580 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p multinode-309580 cp multinode-309580:/home/docker/cp-test.txt /tmp/TestMultiNodeserialCopyFile2905995742/001/cp-test_multinode-309580.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p multinode-309580 ssh -n multinode-309580 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p multinode-309580 cp multinode-309580:/home/docker/cp-test.txt multinode-309580-m02:/home/docker/cp-test_multinode-309580_multinode-309580-m02.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p multinode-309580 ssh -n multinode-309580 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p multinode-309580 ssh -n multinode-309580-m02 "sudo cat /home/docker/cp-test_multinode-309580_multinode-309580-m02.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p multinode-309580 cp multinode-309580:/home/docker/cp-test.txt multinode-309580-m03:/home/docker/cp-test_multinode-309580_multinode-309580-m03.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p multinode-309580 ssh -n multinode-309580 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p multinode-309580 ssh -n multinode-309580-m03 "sudo cat /home/docker/cp-test_multinode-309580_multinode-309580-m03.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p multinode-309580 cp testdata/cp-test.txt multinode-309580-m02:/home/docker/cp-test.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p multinode-309580 ssh -n multinode-309580-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p multinode-309580 cp multinode-309580-m02:/home/docker/cp-test.txt /tmp/TestMultiNodeserialCopyFile2905995742/001/cp-test_multinode-309580-m02.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p multinode-309580 ssh -n multinode-309580-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p multinode-309580 cp multinode-309580-m02:/home/docker/cp-test.txt multinode-309580:/home/docker/cp-test_multinode-309580-m02_multinode-309580.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p multinode-309580 ssh -n multinode-309580-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p multinode-309580 ssh -n multinode-309580 "sudo cat /home/docker/cp-test_multinode-309580-m02_multinode-309580.txt"
E1124 10:01:03.605202 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/addons-674149/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p multinode-309580 cp multinode-309580-m02:/home/docker/cp-test.txt multinode-309580-m03:/home/docker/cp-test_multinode-309580-m02_multinode-309580-m03.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p multinode-309580 ssh -n multinode-309580-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p multinode-309580 ssh -n multinode-309580-m03 "sudo cat /home/docker/cp-test_multinode-309580-m02_multinode-309580-m03.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p multinode-309580 cp testdata/cp-test.txt multinode-309580-m03:/home/docker/cp-test.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p multinode-309580 ssh -n multinode-309580-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p multinode-309580 cp multinode-309580-m03:/home/docker/cp-test.txt /tmp/TestMultiNodeserialCopyFile2905995742/001/cp-test_multinode-309580-m03.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p multinode-309580 ssh -n multinode-309580-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p multinode-309580 cp multinode-309580-m03:/home/docker/cp-test.txt multinode-309580:/home/docker/cp-test_multinode-309580-m03_multinode-309580.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p multinode-309580 ssh -n multinode-309580-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p multinode-309580 ssh -n multinode-309580 "sudo cat /home/docker/cp-test_multinode-309580-m03_multinode-309580.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p multinode-309580 cp multinode-309580-m03:/home/docker/cp-test.txt multinode-309580-m02:/home/docker/cp-test_multinode-309580-m03_multinode-309580-m02.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p multinode-309580 ssh -n multinode-309580-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p multinode-309580 ssh -n multinode-309580-m02 "sudo cat /home/docker/cp-test_multinode-309580-m03_multinode-309580-m02.txt"
E1124 10:01:07.788780 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-941011/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
--- PASS: TestMultiNode/serial/CopyFile (10.88s)

TestMultiNode/serial/StopNode (2.42s)

=== RUN   TestMultiNode/serial/StopNode
multinode_test.go:248: (dbg) Run:  out/minikube-linux-arm64 -p multinode-309580 node stop m03
multinode_test.go:248: (dbg) Done: out/minikube-linux-arm64 -p multinode-309580 node stop m03: (1.300120836s)
multinode_test.go:254: (dbg) Run:  out/minikube-linux-arm64 -p multinode-309580 status
multinode_test.go:254: (dbg) Non-zero exit: out/minikube-linux-arm64 -p multinode-309580 status: exit status 7 (568.093169ms)

-- stdout --
	multinode-309580
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	multinode-309580-m02
	type: Worker
	host: Running
	kubelet: Running
	
	multinode-309580-m03
	type: Worker
	host: Stopped
	kubelet: Stopped
	

-- /stdout --
multinode_test.go:261: (dbg) Run:  out/minikube-linux-arm64 -p multinode-309580 status --alsologtostderr
multinode_test.go:261: (dbg) Non-zero exit: out/minikube-linux-arm64 -p multinode-309580 status --alsologtostderr: exit status 7 (553.790583ms)

-- stdout --
	multinode-309580
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	multinode-309580-m02
	type: Worker
	host: Running
	kubelet: Running
	
	multinode-309580-m03
	type: Worker
	host: Stopped
	kubelet: Stopped
	

-- /stdout --
** stderr ** 
	I1124 10:01:09.971760 1812918 out.go:360] Setting OutFile to fd 1 ...
	I1124 10:01:09.971996 1812918 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1124 10:01:09.972023 1812918 out.go:374] Setting ErrFile to fd 2...
	I1124 10:01:09.972042 1812918 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1124 10:01:09.972437 1812918 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21978-1652607/.minikube/bin
	I1124 10:01:09.972718 1812918 out.go:368] Setting JSON to false
	I1124 10:01:09.972769 1812918 mustload.go:66] Loading cluster: multinode-309580
	I1124 10:01:09.973470 1812918 config.go:182] Loaded profile config "multinode-309580": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
	I1124 10:01:09.973507 1812918 status.go:174] checking status of multinode-309580 ...
	I1124 10:01:09.974703 1812918 notify.go:221] Checking for updates...
	I1124 10:01:09.974821 1812918 cli_runner.go:164] Run: docker container inspect multinode-309580 --format={{.State.Status}}
	I1124 10:01:09.998075 1812918 status.go:371] multinode-309580 host status = "Running" (err=<nil>)
	I1124 10:01:09.998102 1812918 host.go:66] Checking if "multinode-309580" exists ...
	I1124 10:01:09.998397 1812918 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" multinode-309580
	I1124 10:01:10.029437 1812918 host.go:66] Checking if "multinode-309580" exists ...
	I1124 10:01:10.029864 1812918 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1124 10:01:10.029996 1812918 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" multinode-309580
	I1124 10:01:10.053165 1812918 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34809 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/multinode-309580/id_rsa Username:docker}
	I1124 10:01:10.161278 1812918 ssh_runner.go:195] Run: systemctl --version
	I1124 10:01:10.168369 1812918 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1124 10:01:10.182063 1812918 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1124 10:01:10.244738 1812918 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:3 ContainersRunning:2 ContainersPaused:0 ContainersStopped:1 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:50 OomKillDisable:true NGoroutines:62 SystemTime:2025-11-24 10:01:10.234025314 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214831104 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1124 10:01:10.245697 1812918 kubeconfig.go:125] found "multinode-309580" server: "https://192.168.67.2:8443"
	I1124 10:01:10.245784 1812918 api_server.go:166] Checking apiserver status ...
	I1124 10:01:10.245841 1812918 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1124 10:01:10.258310 1812918 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/1363/cgroup
	I1124 10:01:10.267318 1812918 api_server.go:182] apiserver freezer: "11:freezer:/docker/195022f51db7ed617f9df18ff7acdfb2eef89e35a5ad42e9c42a3c92ca86bdb5/kubepods/burstable/pod02bc45f38a0e4a7f18537745f8223f8d/a685bbf8a690e26113e88a0e75b0a2bce6e29de3b204f616b694dc07badb5ccf"
	I1124 10:01:10.267397 1812918 ssh_runner.go:195] Run: sudo cat /sys/fs/cgroup/freezer/docker/195022f51db7ed617f9df18ff7acdfb2eef89e35a5ad42e9c42a3c92ca86bdb5/kubepods/burstable/pod02bc45f38a0e4a7f18537745f8223f8d/a685bbf8a690e26113e88a0e75b0a2bce6e29de3b204f616b694dc07badb5ccf/freezer.state
	I1124 10:01:10.275927 1812918 api_server.go:204] freezer state: "THAWED"
	I1124 10:01:10.276005 1812918 api_server.go:253] Checking apiserver healthz at https://192.168.67.2:8443/healthz ...
	I1124 10:01:10.284533 1812918 api_server.go:279] https://192.168.67.2:8443/healthz returned 200:
	ok
	I1124 10:01:10.284564 1812918 status.go:463] multinode-309580 apiserver status = Running (err=<nil>)
	I1124 10:01:10.284603 1812918 status.go:176] multinode-309580 status: &{Name:multinode-309580 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I1124 10:01:10.284626 1812918 status.go:174] checking status of multinode-309580-m02 ...
	I1124 10:01:10.284963 1812918 cli_runner.go:164] Run: docker container inspect multinode-309580-m02 --format={{.State.Status}}
	I1124 10:01:10.302855 1812918 status.go:371] multinode-309580-m02 host status = "Running" (err=<nil>)
	I1124 10:01:10.302889 1812918 host.go:66] Checking if "multinode-309580-m02" exists ...
	I1124 10:01:10.303231 1812918 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" multinode-309580-m02
	I1124 10:01:10.320350 1812918 host.go:66] Checking if "multinode-309580-m02" exists ...
	I1124 10:01:10.320672 1812918 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1124 10:01:10.320727 1812918 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" multinode-309580-m02
	I1124 10:01:10.338195 1812918 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34814 SSHKeyPath:/home/jenkins/minikube-integration/21978-1652607/.minikube/machines/multinode-309580-m02/id_rsa Username:docker}
	I1124 10:01:10.443679 1812918 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1124 10:01:10.458124 1812918 status.go:176] multinode-309580-m02 status: &{Name:multinode-309580-m02 Host:Running Kubelet:Running APIServer:Irrelevant Kubeconfig:Irrelevant Worker:true TimeToStop: DockerEnv: PodManEnv:}
	I1124 10:01:10.458170 1812918 status.go:174] checking status of multinode-309580-m03 ...
	I1124 10:01:10.458528 1812918 cli_runner.go:164] Run: docker container inspect multinode-309580-m03 --format={{.State.Status}}
	I1124 10:01:10.475427 1812918 status.go:371] multinode-309580-m03 host status = "Stopped" (err=<nil>)
	I1124 10:01:10.475453 1812918 status.go:384] host is not running, skipping remaining checks
	I1124 10:01:10.475460 1812918 status.go:176] multinode-309580-m03 status: &{Name:multinode-309580-m03 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:true TimeToStop: DockerEnv: PodManEnv:}

** /stderr **
--- PASS: TestMultiNode/serial/StopNode (2.42s)
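An editorial aside on the transcript above: the plain-text `minikube status` report is line-oriented (a node name, then `type:` / `host:` / `kubelet:` fields), so it can be summarized with standard tools. A minimal sketch assuming only the layout shown in this run, with the sample embedded so the snippet is self-contained (a real script would pipe the command's output in):

```shell
# Sample copied from the transcript above; stands in for `minikube -p <profile> status` output.
status_output='multinode-309580
type: Control Plane
host: Running
kubelet: Running
apiserver: Running
kubeconfig: Configured

multinode-309580-m02
type: Worker
host: Running
kubelet: Running

multinode-309580-m03
type: Worker
host: Stopped
kubelet: Stopped'

printf '%s\n' "$status_output" | awk '
  NF && $0 !~ /:/ { node = $0 }       # a non-empty line with no ":" names the next node
  /^host:/        { print node, $2 }  # emit "<node> <host state>" for that node
'
```

Against this sample the pipeline prints one `<node> <host-state>` pair per node; note the test itself checks liveness via the command's exit code rather than scraping this text.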

TestMultiNode/serial/StartAfterStop (7.94s)

=== RUN   TestMultiNode/serial/StartAfterStop
multinode_test.go:282: (dbg) Run:  out/minikube-linux-arm64 -p multinode-309580 node start m03 -v=5 --alsologtostderr
multinode_test.go:282: (dbg) Done: out/minikube-linux-arm64 -p multinode-309580 node start m03 -v=5 --alsologtostderr: (7.13658694s)
multinode_test.go:290: (dbg) Run:  out/minikube-linux-arm64 -p multinode-309580 status -v=5 --alsologtostderr
multinode_test.go:306: (dbg) Run:  kubectl get nodes
--- PASS: TestMultiNode/serial/StartAfterStop (7.94s)

TestMultiNode/serial/RestartKeepsNodes (78.67s)

=== RUN   TestMultiNode/serial/RestartKeepsNodes
multinode_test.go:314: (dbg) Run:  out/minikube-linux-arm64 node list -p multinode-309580
multinode_test.go:321: (dbg) Run:  out/minikube-linux-arm64 stop -p multinode-309580
E1124 10:01:24.717016 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-941011/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
multinode_test.go:321: (dbg) Done: out/minikube-linux-arm64 stop -p multinode-309580: (25.145097422s)
multinode_test.go:326: (dbg) Run:  out/minikube-linux-arm64 start -p multinode-309580 --wait=true -v=5 --alsologtostderr
E1124 10:02:26.677714 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/addons-674149/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
multinode_test.go:326: (dbg) Done: out/minikube-linux-arm64 start -p multinode-309580 --wait=true -v=5 --alsologtostderr: (53.400468717s)
multinode_test.go:331: (dbg) Run:  out/minikube-linux-arm64 node list -p multinode-309580
--- PASS: TestMultiNode/serial/RestartKeepsNodes (78.67s)

TestMultiNode/serial/DeleteNode (5.75s)

=== RUN   TestMultiNode/serial/DeleteNode
multinode_test.go:416: (dbg) Run:  out/minikube-linux-arm64 -p multinode-309580 node delete m03
multinode_test.go:416: (dbg) Done: out/minikube-linux-arm64 -p multinode-309580 node delete m03: (4.981829934s)
multinode_test.go:422: (dbg) Run:  out/minikube-linux-arm64 -p multinode-309580 status --alsologtostderr
multinode_test.go:436: (dbg) Run:  kubectl get nodes
multinode_test.go:444: (dbg) Run:  kubectl get nodes -o "go-template='{{range .items}}{{range .status.conditions}}{{if eq .type "Ready"}} {{.status}}{{"\n"}}{{end}}{{end}}{{end}}'"
--- PASS: TestMultiNode/serial/DeleteNode (5.75s)

TestMultiNode/serial/StopMultiNode (24.17s)

=== RUN   TestMultiNode/serial/StopMultiNode
multinode_test.go:345: (dbg) Run:  out/minikube-linux-arm64 -p multinode-309580 stop
multinode_test.go:345: (dbg) Done: out/minikube-linux-arm64 -p multinode-309580 stop: (23.96521631s)
multinode_test.go:351: (dbg) Run:  out/minikube-linux-arm64 -p multinode-309580 status
multinode_test.go:351: (dbg) Non-zero exit: out/minikube-linux-arm64 -p multinode-309580 status: exit status 7 (101.799539ms)

-- stdout --
	multinode-309580
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	
	multinode-309580-m02
	type: Worker
	host: Stopped
	kubelet: Stopped
	

-- /stdout --
multinode_test.go:358: (dbg) Run:  out/minikube-linux-arm64 -p multinode-309580 status --alsologtostderr
multinode_test.go:358: (dbg) Non-zero exit: out/minikube-linux-arm64 -p multinode-309580 status --alsologtostderr: exit status 7 (103.087364ms)

-- stdout --
	multinode-309580
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	
	multinode-309580-m02
	type: Worker
	host: Stopped
	kubelet: Stopped
	

-- /stdout --
** stderr ** 
	I1124 10:03:06.963484 1821747 out.go:360] Setting OutFile to fd 1 ...
	I1124 10:03:06.963638 1821747 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1124 10:03:06.963672 1821747 out.go:374] Setting ErrFile to fd 2...
	I1124 10:03:06.963685 1821747 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1124 10:03:06.963937 1821747 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21978-1652607/.minikube/bin
	I1124 10:03:06.964147 1821747 out.go:368] Setting JSON to false
	I1124 10:03:06.964193 1821747 mustload.go:66] Loading cluster: multinode-309580
	I1124 10:03:06.964283 1821747 notify.go:221] Checking for updates...
	I1124 10:03:06.964641 1821747 config.go:182] Loaded profile config "multinode-309580": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
	I1124 10:03:06.964661 1821747 status.go:174] checking status of multinode-309580 ...
	I1124 10:03:06.965219 1821747 cli_runner.go:164] Run: docker container inspect multinode-309580 --format={{.State.Status}}
	I1124 10:03:06.985713 1821747 status.go:371] multinode-309580 host status = "Stopped" (err=<nil>)
	I1124 10:03:06.985738 1821747 status.go:384] host is not running, skipping remaining checks
	I1124 10:03:06.985746 1821747 status.go:176] multinode-309580 status: &{Name:multinode-309580 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I1124 10:03:06.985775 1821747 status.go:174] checking status of multinode-309580-m02 ...
	I1124 10:03:06.986077 1821747 cli_runner.go:164] Run: docker container inspect multinode-309580-m02 --format={{.State.Status}}
	I1124 10:03:07.014223 1821747 status.go:371] multinode-309580-m02 host status = "Stopped" (err=<nil>)
	I1124 10:03:07.014242 1821747 status.go:384] host is not running, skipping remaining checks
	I1124 10:03:07.014254 1821747 status.go:176] multinode-309580-m02 status: &{Name:multinode-309580-m02 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:true TimeToStop: DockerEnv: PodManEnv:}

** /stderr **
--- PASS: TestMultiNode/serial/StopMultiNode (24.17s)
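One more aside: the `Non-zero exit ... exit status 7` lines above show that `minikube status` reports a down cluster through its exit code, which is friendlier to wrapper scripts than scraping stdout. A hedged sketch with a stub function standing in for the real binary (7 is simply the value observed in this run, not a constant taken from minikube's documentation):

```shell
# Stub standing in for `out/minikube-linux-arm64 -p multinode-309580 status`;
# in this run the command exited 0 when all hosts ran and 7 when any host was stopped.
minikube_status() { return 7; }

if minikube_status; then
  echo "cluster up"
else
  rc=$?                                  # exit status of the condition just tested
  echo "cluster degraded or stopped (exit $rc)"
fi
```

With the stub returning 7, the script takes the `else` branch and reports the code it saw.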

TestMultiNode/serial/RestartMultiNode (52.01s)

=== RUN   TestMultiNode/serial/RestartMultiNode
multinode_test.go:376: (dbg) Run:  out/minikube-linux-arm64 start -p multinode-309580 --wait=true -v=5 --alsologtostderr --driver=docker  --container-runtime=containerd
E1124 10:03:14.515589 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
multinode_test.go:376: (dbg) Done: out/minikube-linux-arm64 start -p multinode-309580 --wait=true -v=5 --alsologtostderr --driver=docker  --container-runtime=containerd: (51.302401042s)
multinode_test.go:382: (dbg) Run:  out/minikube-linux-arm64 -p multinode-309580 status --alsologtostderr
multinode_test.go:396: (dbg) Run:  kubectl get nodes
multinode_test.go:404: (dbg) Run:  kubectl get nodes -o "go-template='{{range .items}}{{range .status.conditions}}{{if eq .type "Ready"}} {{.status}}{{"\n"}}{{end}}{{end}}{{end}}'"
--- PASS: TestMultiNode/serial/RestartMultiNode (52.01s)

TestMultiNode/serial/ValidateNameConflict (37.79s)

=== RUN   TestMultiNode/serial/ValidateNameConflict
multinode_test.go:455: (dbg) Run:  out/minikube-linux-arm64 node list -p multinode-309580
multinode_test.go:464: (dbg) Run:  out/minikube-linux-arm64 start -p multinode-309580-m02 --driver=docker  --container-runtime=containerd
multinode_test.go:464: (dbg) Non-zero exit: out/minikube-linux-arm64 start -p multinode-309580-m02 --driver=docker  --container-runtime=containerd: exit status 14 (94.51073ms)

-- stdout --
	* [multinode-309580-m02] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	  - MINIKUBE_LOCATION=21978
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/21978-1652607/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/21978-1652607/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-arm64
	  - MINIKUBE_FORCE_SYSTEMD=
	
	

-- /stdout --
** stderr ** 
	! Profile name 'multinode-309580-m02' is duplicated with machine name 'multinode-309580-m02' in profile 'multinode-309580'
	X Exiting due to MK_USAGE: Profile name should be unique

** /stderr **
multinode_test.go:472: (dbg) Run:  out/minikube-linux-arm64 start -p multinode-309580-m03 --driver=docker  --container-runtime=containerd
multinode_test.go:472: (dbg) Done: out/minikube-linux-arm64 start -p multinode-309580-m03 --driver=docker  --container-runtime=containerd: (35.161845812s)
multinode_test.go:479: (dbg) Run:  out/minikube-linux-arm64 node add -p multinode-309580
multinode_test.go:479: (dbg) Non-zero exit: out/minikube-linux-arm64 node add -p multinode-309580: exit status 80 (348.198072ms)

-- stdout --
	* Adding node m03 to cluster multinode-309580 as [worker]
	
	

-- /stdout --
** stderr ** 
	X Exiting due to GUEST_NODE_ADD: failed to add node: Node multinode-309580-m03 already exists in multinode-309580-m03 profile
	* 
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│    * Please also attach the following file to the GitHub issue:                             │
	│    * - /tmp/minikube_node_040ea7097fd6ed71e65be9a474587f81f0ccd21d_1.log                    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯

** /stderr **
multinode_test.go:484: (dbg) Run:  out/minikube-linux-arm64 delete -p multinode-309580-m03
multinode_test.go:484: (dbg) Done: out/minikube-linux-arm64 delete -p multinode-309580-m03: (2.129229945s)
--- PASS: TestMultiNode/serial/ValidateNameConflict (37.79s)

TestPreload (119.87s)

=== RUN   TestPreload
preload_test.go:43: (dbg) Run:  out/minikube-linux-arm64 start -p test-preload-052687 --memory=3072 --alsologtostderr --wait=true --preload=false --driver=docker  --container-runtime=containerd --kubernetes-version=v1.32.0
preload_test.go:43: (dbg) Done: out/minikube-linux-arm64 start -p test-preload-052687 --memory=3072 --alsologtostderr --wait=true --preload=false --driver=docker  --container-runtime=containerd --kubernetes-version=v1.32.0: (53.750602904s)
preload_test.go:51: (dbg) Run:  out/minikube-linux-arm64 -p test-preload-052687 image pull gcr.io/k8s-minikube/busybox
preload_test.go:51: (dbg) Done: out/minikube-linux-arm64 -p test-preload-052687 image pull gcr.io/k8s-minikube/busybox: (2.25133115s)
preload_test.go:57: (dbg) Run:  out/minikube-linux-arm64 stop -p test-preload-052687
preload_test.go:57: (dbg) Done: out/minikube-linux-arm64 stop -p test-preload-052687: (5.812135685s)
preload_test.go:65: (dbg) Run:  out/minikube-linux-arm64 start -p test-preload-052687 --memory=3072 --alsologtostderr -v=1 --wait=true --driver=docker  --container-runtime=containerd
E1124 10:06:03.604212 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/addons-674149/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1124 10:06:24.716892 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-941011/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
preload_test.go:65: (dbg) Done: out/minikube-linux-arm64 start -p test-preload-052687 --memory=3072 --alsologtostderr -v=1 --wait=true --driver=docker  --container-runtime=containerd: (55.308216314s)
preload_test.go:70: (dbg) Run:  out/minikube-linux-arm64 -p test-preload-052687 image list
helpers_test.go:175: Cleaning up "test-preload-052687" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-arm64 delete -p test-preload-052687
helpers_test.go:178: (dbg) Done: out/minikube-linux-arm64 delete -p test-preload-052687: (2.493613302s)
--- PASS: TestPreload (119.87s)

TestScheduledStopUnix (111.36s)

=== RUN   TestScheduledStopUnix
scheduled_stop_test.go:128: (dbg) Run:  out/minikube-linux-arm64 start -p scheduled-stop-463162 --memory=3072 --driver=docker  --container-runtime=containerd
scheduled_stop_test.go:128: (dbg) Done: out/minikube-linux-arm64 start -p scheduled-stop-463162 --memory=3072 --driver=docker  --container-runtime=containerd: (35.52585083s)
scheduled_stop_test.go:137: (dbg) Run:  out/minikube-linux-arm64 stop -p scheduled-stop-463162 --schedule 5m -v=5 --alsologtostderr
minikube stop output:

** stderr ** 
	I1124 10:07:16.570018 1837672 out.go:360] Setting OutFile to fd 1 ...
	I1124 10:07:16.570144 1837672 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1124 10:07:16.570157 1837672 out.go:374] Setting ErrFile to fd 2...
	I1124 10:07:16.570163 1837672 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1124 10:07:16.570423 1837672 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21978-1652607/.minikube/bin
	I1124 10:07:16.570727 1837672 out.go:368] Setting JSON to false
	I1124 10:07:16.570844 1837672 mustload.go:66] Loading cluster: scheduled-stop-463162
	I1124 10:07:16.571241 1837672 config.go:182] Loaded profile config "scheduled-stop-463162": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
	I1124 10:07:16.571317 1837672 profile.go:143] Saving config to /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/scheduled-stop-463162/config.json ...
	I1124 10:07:16.571612 1837672 mustload.go:66] Loading cluster: scheduled-stop-463162
	I1124 10:07:16.571794 1837672 config.go:182] Loaded profile config "scheduled-stop-463162": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2

** /stderr **
scheduled_stop_test.go:204: (dbg) Run:  out/minikube-linux-arm64 status --format={{.TimeToStop}} -p scheduled-stop-463162 -n scheduled-stop-463162
scheduled_stop_test.go:172: signal error was:  <nil>
scheduled_stop_test.go:137: (dbg) Run:  out/minikube-linux-arm64 stop -p scheduled-stop-463162 --schedule 15s -v=5 --alsologtostderr
minikube stop output:

** stderr ** 
	I1124 10:07:17.036233 1837758 out.go:360] Setting OutFile to fd 1 ...
	I1124 10:07:17.036440 1837758 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1124 10:07:17.036474 1837758 out.go:374] Setting ErrFile to fd 2...
	I1124 10:07:17.036495 1837758 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1124 10:07:17.036893 1837758 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21978-1652607/.minikube/bin
	I1124 10:07:17.037280 1837758 out.go:368] Setting JSON to false
	I1124 10:07:17.037656 1837758 daemonize_unix.go:73] killing process 1837687 as it is an old scheduled stop
	I1124 10:07:17.038018 1837758 mustload.go:66] Loading cluster: scheduled-stop-463162
	I1124 10:07:17.038526 1837758 config.go:182] Loaded profile config "scheduled-stop-463162": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
	I1124 10:07:17.038736 1837758 profile.go:143] Saving config to /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/scheduled-stop-463162/config.json ...
	I1124 10:07:17.038998 1837758 mustload.go:66] Loading cluster: scheduled-stop-463162
	I1124 10:07:17.039243 1837758 config.go:182] Loaded profile config "scheduled-stop-463162": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2

                                                
                                                
** /stderr **
scheduled_stop_test.go:172: signal error was:  os: process already finished
I1124 10:07:17.048595 1654467 retry.go:31] will retry after 88.277µs: open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/scheduled-stop-463162/pid: no such file or directory
I1124 10:07:17.049482 1654467 retry.go:31] will retry after 109.191µs: open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/scheduled-stop-463162/pid: no such file or directory
I1124 10:07:17.050619 1654467 retry.go:31] will retry after 282.218µs: open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/scheduled-stop-463162/pid: no such file or directory
I1124 10:07:17.051746 1654467 retry.go:31] will retry after 494.819µs: open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/scheduled-stop-463162/pid: no such file or directory
I1124 10:07:17.052838 1654467 retry.go:31] will retry after 400.476µs: open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/scheduled-stop-463162/pid: no such file or directory
I1124 10:07:17.054556 1654467 retry.go:31] will retry after 627.83µs: open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/scheduled-stop-463162/pid: no such file or directory
I1124 10:07:17.055637 1654467 retry.go:31] will retry after 1.208766ms: open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/scheduled-stop-463162/pid: no such file or directory
I1124 10:07:17.057844 1654467 retry.go:31] will retry after 1.029223ms: open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/scheduled-stop-463162/pid: no such file or directory
I1124 10:07:17.058981 1654467 retry.go:31] will retry after 1.395927ms: open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/scheduled-stop-463162/pid: no such file or directory
I1124 10:07:17.061313 1654467 retry.go:31] will retry after 5.145034ms: open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/scheduled-stop-463162/pid: no such file or directory
I1124 10:07:17.067557 1654467 retry.go:31] will retry after 6.560935ms: open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/scheduled-stop-463162/pid: no such file or directory
I1124 10:07:17.076035 1654467 retry.go:31] will retry after 8.192658ms: open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/scheduled-stop-463162/pid: no such file or directory
I1124 10:07:17.085744 1654467 retry.go:31] will retry after 17.986137ms: open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/scheduled-stop-463162/pid: no such file or directory
I1124 10:07:17.103971 1654467 retry.go:31] will retry after 14.080627ms: open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/scheduled-stop-463162/pid: no such file or directory
I1124 10:07:17.118164 1654467 retry.go:31] will retry after 35.907738ms: open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/scheduled-stop-463162/pid: no such file or directory
scheduled_stop_test.go:137: (dbg) Run:  out/minikube-linux-arm64 stop -p scheduled-stop-463162 --cancel-scheduled
minikube stop output:

                                                
                                                
-- stdout --
	* All existing scheduled stops cancelled

                                                
                                                
-- /stdout --
scheduled_stop_test.go:189: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p scheduled-stop-463162 -n scheduled-stop-463162
scheduled_stop_test.go:218: (dbg) Run:  out/minikube-linux-arm64 status -p scheduled-stop-463162
scheduled_stop_test.go:137: (dbg) Run:  out/minikube-linux-arm64 stop -p scheduled-stop-463162 --schedule 15s -v=5 --alsologtostderr
minikube stop output:

                                                
                                                
** stderr ** 
	I1124 10:07:42.999187 1838439 out.go:360] Setting OutFile to fd 1 ...
	I1124 10:07:42.999400 1838439 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1124 10:07:42.999431 1838439 out.go:374] Setting ErrFile to fd 2...
	I1124 10:07:42.999451 1838439 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1124 10:07:42.999747 1838439 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21978-1652607/.minikube/bin
	I1124 10:07:43.000014 1838439 out.go:368] Setting JSON to false
	I1124 10:07:43.000142 1838439 mustload.go:66] Loading cluster: scheduled-stop-463162
	I1124 10:07:43.000530 1838439 config.go:182] Loaded profile config "scheduled-stop-463162": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
	I1124 10:07:43.000633 1838439 profile.go:143] Saving config to /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/scheduled-stop-463162/config.json ...
	I1124 10:07:43.000949 1838439 mustload.go:66] Loading cluster: scheduled-stop-463162
	I1124 10:07:43.001144 1838439 config.go:182] Loaded profile config "scheduled-stop-463162": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2

                                                
                                                
** /stderr **
scheduled_stop_test.go:172: signal error was:  os: process already finished
E1124 10:08:14.515613 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
scheduled_stop_test.go:218: (dbg) Run:  out/minikube-linux-arm64 status -p scheduled-stop-463162
scheduled_stop_test.go:218: (dbg) Non-zero exit: out/minikube-linux-arm64 status -p scheduled-stop-463162: exit status 7 (73.460952ms)

                                                
                                                
-- stdout --
	scheduled-stop-463162
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	

                                                
                                                
-- /stdout --
scheduled_stop_test.go:189: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p scheduled-stop-463162 -n scheduled-stop-463162
scheduled_stop_test.go:189: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p scheduled-stop-463162 -n scheduled-stop-463162: exit status 7 (74.152135ms)

                                                
                                                
-- stdout --
	Stopped

                                                
                                                
-- /stdout --
scheduled_stop_test.go:189: status error: exit status 7 (may be ok)
helpers_test.go:175: Cleaning up "scheduled-stop-463162" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-arm64 delete -p scheduled-stop-463162
helpers_test.go:178: (dbg) Done: out/minikube-linux-arm64 delete -p scheduled-stop-463162: (4.183481086s)
--- PASS: TestScheduledStopUnix (111.36s)

                                                
                                    
TestInsufficientStorage (13.19s)

                                                
                                                
=== RUN   TestInsufficientStorage
status_test.go:50: (dbg) Run:  out/minikube-linux-arm64 start -p insufficient-storage-568457 --memory=3072 --output=json --wait=true --driver=docker  --container-runtime=containerd
status_test.go:50: (dbg) Non-zero exit: out/minikube-linux-arm64 start -p insufficient-storage-568457 --memory=3072 --output=json --wait=true --driver=docker  --container-runtime=containerd: exit status 26 (10.501099033s)

                                                
                                                
-- stdout --
	{"specversion":"1.0","id":"0eb2d0e5-2037-4e05-a94c-d73c6a39f41a","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.step","datacontenttype":"application/json","data":{"currentstep":"0","message":"[insufficient-storage-568457] minikube v1.37.0 on Ubuntu 20.04 (arm64)","name":"Initial Minikube Setup","totalsteps":"19"}}
	{"specversion":"1.0","id":"1c072b02-b201-44b1-a43c-3d2928250587","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_LOCATION=21978"}}
	{"specversion":"1.0","id":"fe36f842-ff34-4add-af7b-3a1da0268488","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true"}}
	{"specversion":"1.0","id":"3e25b547-eb59-4f5e-aae3-afb163d46d91","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"KUBECONFIG=/home/jenkins/minikube-integration/21978-1652607/kubeconfig"}}
	{"specversion":"1.0","id":"45dc1de5-15b9-403e-9351-94dc1a924c5b","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_HOME=/home/jenkins/minikube-integration/21978-1652607/.minikube"}}
	{"specversion":"1.0","id":"7246fa1a-633e-4db3-b8de-b73c50d34cfd","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_BIN=out/minikube-linux-arm64"}}
	{"specversion":"1.0","id":"a341ee54-cc95-430d-b698-d4daf882789c","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_FORCE_SYSTEMD="}}
	{"specversion":"1.0","id":"614858a7-5761-4deb-935e-530752ccc454","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_TEST_STORAGE_CAPACITY=100"}}
	{"specversion":"1.0","id":"3b54316a-cee8-4aa4-994a-addc9a495204","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_TEST_AVAILABLE_STORAGE=19"}}
	{"specversion":"1.0","id":"92ad5d8d-cb53-474c-bc69-e9b861433d35","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.step","datacontenttype":"application/json","data":{"currentstep":"1","message":"Using the docker driver based on user configuration","name":"Selecting Driver","totalsteps":"19"}}
	{"specversion":"1.0","id":"de572437-6672-4ba5-8b13-ccf8b51839c5","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"Using Docker driver with root privileges"}}
	{"specversion":"1.0","id":"72b83953-ae36-454d-b51c-e4e86d5928cb","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.step","datacontenttype":"application/json","data":{"currentstep":"3","message":"Starting \"insufficient-storage-568457\" primary control-plane node in \"insufficient-storage-568457\" cluster","name":"Starting Node","totalsteps":"19"}}
	{"specversion":"1.0","id":"2e9d11fa-7577-4b6a-988f-fc1b3eff3cf6","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.step","datacontenttype":"application/json","data":{"currentstep":"5","message":"Pulling base image v0.0.48-1763789673-21948 ...","name":"Pulling Base Image","totalsteps":"19"}}
	{"specversion":"1.0","id":"f47eedf7-aa9b-437d-bb0b-b221fa4dd7f7","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.step","datacontenttype":"application/json","data":{"currentstep":"8","message":"Creating docker container (CPUs=2, Memory=3072MB) ...","name":"Creating Container","totalsteps":"19"}}
	{"specversion":"1.0","id":"75781155-5907-4d57-be98-f4d158d1650b","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.error","datacontenttype":"application/json","data":{"advice":"Try one or more of the following to free up space on the device:\n\n\t\t\t1. Run \"docker system prune\" to remove unused Docker data (optionally with \"-a\")\n\t\t\t2. Increase the storage allocated to Docker for Desktop by clicking on:\n\t\t\t\tDocker icon \u003e Preferences \u003e Resources \u003e Disk Image Size\n\t\t\t3. Run \"minikube ssh -- docker system prune\" if using the Docker container runtime","exitcode":"26","issues":"https://github.com/kubernetes/minikube/issues/9024","message":"Docker is out of disk space! (/var is at 100% of capacity). You can pass '--force' to skip this check.","name":"RSRC_DOCKER_STORAGE","url":""}}

                                                
                                                
-- /stdout --
status_test.go:76: (dbg) Run:  out/minikube-linux-arm64 status -p insufficient-storage-568457 --output=json --layout=cluster
status_test.go:76: (dbg) Non-zero exit: out/minikube-linux-arm64 status -p insufficient-storage-568457 --output=json --layout=cluster: exit status 7 (323.239237ms)

                                                
                                                
-- stdout --
	{"Name":"insufficient-storage-568457","StatusCode":507,"StatusName":"InsufficientStorage","StatusDetail":"/var is almost out of disk space","Step":"Creating Container","StepDetail":"Creating docker container (CPUs=2, Memory=3072MB) ...","BinaryVersion":"v1.37.0","Components":{"kubeconfig":{"Name":"kubeconfig","StatusCode":500,"StatusName":"Error"}},"Nodes":[{"Name":"insufficient-storage-568457","StatusCode":507,"StatusName":"InsufficientStorage","Components":{"apiserver":{"Name":"apiserver","StatusCode":405,"StatusName":"Stopped"},"kubelet":{"Name":"kubelet","StatusCode":405,"StatusName":"Stopped"}}}]}

                                                
                                                
-- /stdout --
** stderr ** 
	E1124 10:08:43.150268 1840258 status.go:458] kubeconfig endpoint: get endpoint: "insufficient-storage-568457" does not appear in /home/jenkins/minikube-integration/21978-1652607/kubeconfig

                                                
                                                
** /stderr **
status_test.go:76: (dbg) Run:  out/minikube-linux-arm64 status -p insufficient-storage-568457 --output=json --layout=cluster
status_test.go:76: (dbg) Non-zero exit: out/minikube-linux-arm64 status -p insufficient-storage-568457 --output=json --layout=cluster: exit status 7 (309.189563ms)

                                                
                                                
-- stdout --
	{"Name":"insufficient-storage-568457","StatusCode":507,"StatusName":"InsufficientStorage","StatusDetail":"/var is almost out of disk space","BinaryVersion":"v1.37.0","Components":{"kubeconfig":{"Name":"kubeconfig","StatusCode":500,"StatusName":"Error"}},"Nodes":[{"Name":"insufficient-storage-568457","StatusCode":507,"StatusName":"InsufficientStorage","Components":{"apiserver":{"Name":"apiserver","StatusCode":405,"StatusName":"Stopped"},"kubelet":{"Name":"kubelet","StatusCode":405,"StatusName":"Stopped"}}}]}

                                                
                                                
-- /stdout --
** stderr ** 
	E1124 10:08:43.460348 1840322 status.go:458] kubeconfig endpoint: get endpoint: "insufficient-storage-568457" does not appear in /home/jenkins/minikube-integration/21978-1652607/kubeconfig
	E1124 10:08:43.470673 1840322 status.go:258] unable to read event log: stat: stat /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/insufficient-storage-568457/events.json: no such file or directory

                                                
                                                
** /stderr **
helpers_test.go:175: Cleaning up "insufficient-storage-568457" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-arm64 delete -p insufficient-storage-568457
helpers_test.go:178: (dbg) Done: out/minikube-linux-arm64 delete -p insufficient-storage-568457: (2.057339142s)
--- PASS: TestInsufficientStorage (13.19s)

                                                
                                    
TestRunningBinaryUpgrade (64.15s)

                                                
                                                
=== RUN   TestRunningBinaryUpgrade
=== PAUSE TestRunningBinaryUpgrade

                                                
                                                

                                                
                                                
=== CONT  TestRunningBinaryUpgrade
version_upgrade_test.go:120: (dbg) Run:  /tmp/minikube-v1.32.0.884314995 start -p running-upgrade-905725 --memory=3072 --vm-driver=docker  --container-runtime=containerd
version_upgrade_test.go:120: (dbg) Done: /tmp/minikube-v1.32.0.884314995 start -p running-upgrade-905725 --memory=3072 --vm-driver=docker  --container-runtime=containerd: (35.311337387s)
version_upgrade_test.go:130: (dbg) Run:  out/minikube-linux-arm64 start -p running-upgrade-905725 --memory=3072 --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd
version_upgrade_test.go:130: (dbg) Done: out/minikube-linux-arm64 start -p running-upgrade-905725 --memory=3072 --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd: (25.89504712s)
helpers_test.go:175: Cleaning up "running-upgrade-905725" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-arm64 delete -p running-upgrade-905725
helpers_test.go:178: (dbg) Done: out/minikube-linux-arm64 delete -p running-upgrade-905725: (2.132581586s)
--- PASS: TestRunningBinaryUpgrade (64.15s)

                                                
                                    
TestMissingContainerUpgrade (124.68s)

                                                
                                                
=== RUN   TestMissingContainerUpgrade
=== PAUSE TestMissingContainerUpgrade

                                                
                                                

                                                
                                                
=== CONT  TestMissingContainerUpgrade
version_upgrade_test.go:309: (dbg) Run:  /tmp/minikube-v1.32.0.2718726444 start -p missing-upgrade-383356 --memory=3072 --driver=docker  --container-runtime=containerd
version_upgrade_test.go:309: (dbg) Done: /tmp/minikube-v1.32.0.2718726444 start -p missing-upgrade-383356 --memory=3072 --driver=docker  --container-runtime=containerd: (57.082121603s)
version_upgrade_test.go:318: (dbg) Run:  docker stop missing-upgrade-383356
version_upgrade_test.go:323: (dbg) Run:  docker rm missing-upgrade-383356
version_upgrade_test.go:329: (dbg) Run:  out/minikube-linux-arm64 start -p missing-upgrade-383356 --memory=3072 --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd
version_upgrade_test.go:329: (dbg) Done: out/minikube-linux-arm64 start -p missing-upgrade-383356 --memory=3072 --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd: (1m3.453786748s)
helpers_test.go:175: Cleaning up "missing-upgrade-383356" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-arm64 delete -p missing-upgrade-383356
helpers_test.go:178: (dbg) Done: out/minikube-linux-arm64 delete -p missing-upgrade-383356: (2.533514874s)
--- PASS: TestMissingContainerUpgrade (124.68s)

                                                
                                    
TestNoKubernetes/serial/StartNoK8sWithVersion (0.09s)

                                                
                                                
=== RUN   TestNoKubernetes/serial/StartNoK8sWithVersion
no_kubernetes_test.go:108: (dbg) Run:  out/minikube-linux-arm64 start -p NoKubernetes-512957 --no-kubernetes --kubernetes-version=v1.28.0 --driver=docker  --container-runtime=containerd
no_kubernetes_test.go:108: (dbg) Non-zero exit: out/minikube-linux-arm64 start -p NoKubernetes-512957 --no-kubernetes --kubernetes-version=v1.28.0 --driver=docker  --container-runtime=containerd: exit status 14 (89.341645ms)

                                                
                                                
-- stdout --
	* [NoKubernetes-512957] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	  - MINIKUBE_LOCATION=21978
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/21978-1652607/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/21978-1652607/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-arm64
	  - MINIKUBE_FORCE_SYSTEMD=
	
	

                                                
                                                
-- /stdout --
** stderr ** 
	X Exiting due to MK_USAGE: cannot specify --kubernetes-version with --no-kubernetes,
	to unset a global config run:
	
	$ minikube config unset kubernetes-version

                                                
                                                
** /stderr **
--- PASS: TestNoKubernetes/serial/StartNoK8sWithVersion (0.09s)

                                                
                                    
TestNoKubernetes/serial/StartWithK8s (45.55s)

                                                
                                                
=== RUN   TestNoKubernetes/serial/StartWithK8s
no_kubernetes_test.go:120: (dbg) Run:  out/minikube-linux-arm64 start -p NoKubernetes-512957 --memory=3072 --alsologtostderr -v=5 --driver=docker  --container-runtime=containerd
no_kubernetes_test.go:120: (dbg) Done: out/minikube-linux-arm64 start -p NoKubernetes-512957 --memory=3072 --alsologtostderr -v=5 --driver=docker  --container-runtime=containerd: (44.975038116s)
no_kubernetes_test.go:225: (dbg) Run:  out/minikube-linux-arm64 -p NoKubernetes-512957 status -o json
--- PASS: TestNoKubernetes/serial/StartWithK8s (45.55s)

                                                
                                    
x
+
TestNoKubernetes/serial/StartWithStopK8s (24.17s)

                                                
                                                
=== RUN   TestNoKubernetes/serial/StartWithStopK8s
no_kubernetes_test.go:137: (dbg) Run:  out/minikube-linux-arm64 start -p NoKubernetes-512957 --no-kubernetes --memory=3072 --alsologtostderr -v=5 --driver=docker  --container-runtime=containerd
no_kubernetes_test.go:137: (dbg) Done: out/minikube-linux-arm64 start -p NoKubernetes-512957 --no-kubernetes --memory=3072 --alsologtostderr -v=5 --driver=docker  --container-runtime=containerd: (21.874461614s)
no_kubernetes_test.go:225: (dbg) Run:  out/minikube-linux-arm64 -p NoKubernetes-512957 status -o json
no_kubernetes_test.go:225: (dbg) Non-zero exit: out/minikube-linux-arm64 -p NoKubernetes-512957 status -o json: exit status 2 (312.917977ms)

                                                
                                                
-- stdout --
	{"Name":"NoKubernetes-512957","Host":"Running","Kubelet":"Stopped","APIServer":"Stopped","Kubeconfig":"Configured","Worker":false}

                                                
                                                
-- /stdout --
no_kubernetes_test.go:149: (dbg) Run:  out/minikube-linux-arm64 delete -p NoKubernetes-512957
no_kubernetes_test.go:149: (dbg) Done: out/minikube-linux-arm64 delete -p NoKubernetes-512957: (1.982695965s)
--- PASS: TestNoKubernetes/serial/StartWithStopK8s (24.17s)

                                                
                                    
TestNoKubernetes/serial/Start (7.64s)

                                                
                                                
=== RUN   TestNoKubernetes/serial/Start
no_kubernetes_test.go:161: (dbg) Run:  out/minikube-linux-arm64 start -p NoKubernetes-512957 --no-kubernetes --memory=3072 --alsologtostderr -v=5 --driver=docker  --container-runtime=containerd
no_kubernetes_test.go:161: (dbg) Done: out/minikube-linux-arm64 start -p NoKubernetes-512957 --no-kubernetes --memory=3072 --alsologtostderr -v=5 --driver=docker  --container-runtime=containerd: (7.642405346s)
--- PASS: TestNoKubernetes/serial/Start (7.64s)

                                                
                                    
TestNoKubernetes/serial/VerifyNok8sNoK8sDownloads (0s)

                                                
                                                
=== RUN   TestNoKubernetes/serial/VerifyNok8sNoK8sDownloads
no_kubernetes_test.go:89: Checking cache directory: /home/jenkins/minikube-integration/21978-1652607/.minikube/cache/linux/arm64/v0.0.0
--- PASS: TestNoKubernetes/serial/VerifyNok8sNoK8sDownloads (0.00s)

                                                
                                    
TestNoKubernetes/serial/VerifyK8sNotRunning (0.28s)

                                                
                                                
=== RUN   TestNoKubernetes/serial/VerifyK8sNotRunning
no_kubernetes_test.go:172: (dbg) Run:  out/minikube-linux-arm64 ssh -p NoKubernetes-512957 "sudo systemctl is-active --quiet service kubelet"
no_kubernetes_test.go:172: (dbg) Non-zero exit: out/minikube-linux-arm64 ssh -p NoKubernetes-512957 "sudo systemctl is-active --quiet service kubelet": exit status 1 (284.326963ms)

                                                
                                                
** stderr ** 
	ssh: Process exited with status 3

                                                
                                                
** /stderr **
--- PASS: TestNoKubernetes/serial/VerifyK8sNotRunning (0.28s)

                                                
                                    
TestNoKubernetes/serial/ProfileList (0.76s)

                                                
                                                
=== RUN   TestNoKubernetes/serial/ProfileList
no_kubernetes_test.go:194: (dbg) Run:  out/minikube-linux-arm64 profile list
no_kubernetes_test.go:204: (dbg) Run:  out/minikube-linux-arm64 profile list --output=json
--- PASS: TestNoKubernetes/serial/ProfileList (0.76s)

                                                
                                    
TestNoKubernetes/serial/Stop (1.43s)

                                                
                                                
=== RUN   TestNoKubernetes/serial/Stop
no_kubernetes_test.go:183: (dbg) Run:  out/minikube-linux-arm64 stop -p NoKubernetes-512957
no_kubernetes_test.go:183: (dbg) Done: out/minikube-linux-arm64 stop -p NoKubernetes-512957: (1.428601706s)
--- PASS: TestNoKubernetes/serial/Stop (1.43s)

                                                
                                    
TestNoKubernetes/serial/StartNoArgs (7.25s)

                                                
                                                
=== RUN   TestNoKubernetes/serial/StartNoArgs
no_kubernetes_test.go:216: (dbg) Run:  out/minikube-linux-arm64 start -p NoKubernetes-512957 --driver=docker  --container-runtime=containerd
no_kubernetes_test.go:216: (dbg) Done: out/minikube-linux-arm64 start -p NoKubernetes-512957 --driver=docker  --container-runtime=containerd: (7.254146431s)
--- PASS: TestNoKubernetes/serial/StartNoArgs (7.25s)

                                                
                                    
TestNoKubernetes/serial/VerifyK8sNotRunningSecond (0.28s)

                                                
                                                
=== RUN   TestNoKubernetes/serial/VerifyK8sNotRunningSecond
no_kubernetes_test.go:172: (dbg) Run:  out/minikube-linux-arm64 ssh -p NoKubernetes-512957 "sudo systemctl is-active --quiet service kubelet"
no_kubernetes_test.go:172: (dbg) Non-zero exit: out/minikube-linux-arm64 ssh -p NoKubernetes-512957 "sudo systemctl is-active --quiet service kubelet": exit status 1 (280.006622ms)

                                                
                                                
** stderr ** 
	ssh: Process exited with status 3

                                                
                                                
** /stderr **
--- PASS: TestNoKubernetes/serial/VerifyK8sNotRunningSecond (0.28s)

                                                
                                    
TestStoppedBinaryUpgrade/Setup (0.82s)

                                                
                                                
=== RUN   TestStoppedBinaryUpgrade/Setup
--- PASS: TestStoppedBinaryUpgrade/Setup (0.82s)

                                                
                                    
TestStoppedBinaryUpgrade/Upgrade (59.48s)

                                                
                                                
=== RUN   TestStoppedBinaryUpgrade/Upgrade
version_upgrade_test.go:183: (dbg) Run:  /tmp/minikube-v1.32.0.428749286 start -p stopped-upgrade-216773 --memory=3072 --vm-driver=docker  --container-runtime=containerd
version_upgrade_test.go:183: (dbg) Done: /tmp/minikube-v1.32.0.428749286 start -p stopped-upgrade-216773 --memory=3072 --vm-driver=docker  --container-runtime=containerd: (37.588221665s)
version_upgrade_test.go:192: (dbg) Run:  /tmp/minikube-v1.32.0.428749286 -p stopped-upgrade-216773 stop
version_upgrade_test.go:192: (dbg) Done: /tmp/minikube-v1.32.0.428749286 -p stopped-upgrade-216773 stop: (1.244360859s)
version_upgrade_test.go:198: (dbg) Run:  out/minikube-linux-arm64 start -p stopped-upgrade-216773 --memory=3072 --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd
version_upgrade_test.go:198: (dbg) Done: out/minikube-linux-arm64 start -p stopped-upgrade-216773 --memory=3072 --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd: (20.647851402s)
--- PASS: TestStoppedBinaryUpgrade/Upgrade (59.48s)

                                                
                                    
TestStoppedBinaryUpgrade/MinikubeLogs (1.5s)

                                                
                                                
=== RUN   TestStoppedBinaryUpgrade/MinikubeLogs
version_upgrade_test.go:206: (dbg) Run:  out/minikube-linux-arm64 logs -p stopped-upgrade-216773
version_upgrade_test.go:206: (dbg) Done: out/minikube-linux-arm64 logs -p stopped-upgrade-216773: (1.495190077s)
--- PASS: TestStoppedBinaryUpgrade/MinikubeLogs (1.50s)

                                                
                                    
TestPause/serial/Start (87.97s)

                                                
                                                
=== RUN   TestPause/serial/Start
pause_test.go:80: (dbg) Run:  out/minikube-linux-arm64 start -p pause-508844 --memory=3072 --install-addons=false --wait=all --driver=docker  --container-runtime=containerd
E1124 10:13:14.515709 1654467 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21978-1652607/.minikube/profiles/functional-291288/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
pause_test.go:80: (dbg) Done: out/minikube-linux-arm64 start -p pause-508844 --memory=3072 --install-addons=false --wait=all --driver=docker  --container-runtime=containerd: (1m27.972492287s)
--- PASS: TestPause/serial/Start (87.97s)

                                                
                                    
TestPause/serial/SecondStartNoReconfiguration (7.9s)

                                                
                                                
=== RUN   TestPause/serial/SecondStartNoReconfiguration
pause_test.go:92: (dbg) Run:  out/minikube-linux-arm64 start -p pause-508844 --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd
pause_test.go:92: (dbg) Done: out/minikube-linux-arm64 start -p pause-508844 --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd: (7.879149731s)
--- PASS: TestPause/serial/SecondStartNoReconfiguration (7.90s)

TestPause/serial/Pause (0.74s)

=== RUN   TestPause/serial/Pause
pause_test.go:110: (dbg) Run:  out/minikube-linux-arm64 pause -p pause-508844 --alsologtostderr -v=5
--- PASS: TestPause/serial/Pause (0.74s)

TestPause/serial/VerifyStatus (0.47s)

=== RUN   TestPause/serial/VerifyStatus
status_test.go:76: (dbg) Run:  out/minikube-linux-arm64 status -p pause-508844 --output=json --layout=cluster
status_test.go:76: (dbg) Non-zero exit: out/minikube-linux-arm64 status -p pause-508844 --output=json --layout=cluster: exit status 2 (469.627122ms)

-- stdout --
	{"Name":"pause-508844","StatusCode":418,"StatusName":"Paused","Step":"Done","StepDetail":"* Paused 7 containers in: kube-system, kubernetes-dashboard, istio-operator","BinaryVersion":"v1.37.0","Components":{"kubeconfig":{"Name":"kubeconfig","StatusCode":200,"StatusName":"OK"}},"Nodes":[{"Name":"pause-508844","StatusCode":200,"StatusName":"OK","Components":{"apiserver":{"Name":"apiserver","StatusCode":418,"StatusName":"Paused"},"kubelet":{"Name":"kubelet","StatusCode":405,"StatusName":"Stopped"}}}]}

-- /stdout --
--- PASS: TestPause/serial/VerifyStatus (0.47s)
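A note for anyone scripting against this check: `minikube status --output=json --layout=cluster` exits non-zero (exit status 2 above) for a paused cluster while still printing valid JSON, so callers must decode the payload rather than trust the exit code. A minimal sketch in Go, modeling only the fields visible in the captured stdout (not the full schema):

```go
package main

import (
	"encoding/json"
	"fmt"
)

// component mirrors the per-component entries in the stdout above.
type component struct {
	Name       string `json:"Name"`
	StatusCode int    `json:"StatusCode"`
	StatusName string `json:"StatusName"`
}

// clusterStatus is a subset of the --layout=cluster payload.
type clusterStatus struct {
	Name       string `json:"Name"`
	StatusCode int    `json:"StatusCode"`
	StatusName string `json:"StatusName"`
	Nodes      []struct {
		Name       string               `json:"Name"`
		Components map[string]component `json:"Components"`
	} `json:"Nodes"`
}

func main() {
	// Trimmed copy of the stdout captured by the test above.
	raw := `{"Name":"pause-508844","StatusCode":418,"StatusName":"Paused","Nodes":[{"Name":"pause-508844","Components":{"apiserver":{"Name":"apiserver","StatusCode":418,"StatusName":"Paused"},"kubelet":{"Name":"kubelet","StatusCode":405,"StatusName":"Stopped"}}}]}`
	var st clusterStatus
	if err := json.Unmarshal([]byte(raw), &st); err != nil {
		panic(err)
	}
	// In this payload, 418 denotes "Paused" and 405 "Stopped"; inspect
	// these fields instead of the CLI exit code when a pause is expected.
	fmt.Println(st.StatusName, st.Nodes[0].Components["kubelet"].StatusName)
}
```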

TestPause/serial/Unpause (0.74s)

=== RUN   TestPause/serial/Unpause
pause_test.go:121: (dbg) Run:  out/minikube-linux-arm64 unpause -p pause-508844 --alsologtostderr -v=5
--- PASS: TestPause/serial/Unpause (0.74s)

TestPause/serial/PauseAgain (0.87s)

=== RUN   TestPause/serial/PauseAgain
pause_test.go:110: (dbg) Run:  out/minikube-linux-arm64 pause -p pause-508844 --alsologtostderr -v=5
--- PASS: TestPause/serial/PauseAgain (0.87s)

TestPause/serial/DeletePaused (2.72s)

=== RUN   TestPause/serial/DeletePaused
pause_test.go:132: (dbg) Run:  out/minikube-linux-arm64 delete -p pause-508844 --alsologtostderr -v=5
pause_test.go:132: (dbg) Done: out/minikube-linux-arm64 delete -p pause-508844 --alsologtostderr -v=5: (2.722809682s)
--- PASS: TestPause/serial/DeletePaused (2.72s)

TestPause/serial/VerifyDeletedResources (0.42s)

=== RUN   TestPause/serial/VerifyDeletedResources
pause_test.go:142: (dbg) Run:  out/minikube-linux-arm64 profile list --output json
pause_test.go:168: (dbg) Run:  docker ps -a
pause_test.go:173: (dbg) Run:  docker volume inspect pause-508844
pause_test.go:173: (dbg) Non-zero exit: docker volume inspect pause-508844: exit status 1 (17.302952ms)

-- stdout --
	[]

-- /stdout --
** stderr ** 
	Error response from daemon: get pause-508844: no such volume

** /stderr **
pause_test.go:178: (dbg) Run:  docker network ls
--- PASS: TestPause/serial/VerifyDeletedResources (0.42s)
Test skip (34/320)

Order skipped test Duration
5 TestDownloadOnly/v1.28.0/cached-images 0
6 TestDownloadOnly/v1.28.0/binaries 0
7 TestDownloadOnly/v1.28.0/kubectl 0
14 TestDownloadOnly/v1.34.2/cached-images 0
15 TestDownloadOnly/v1.34.2/binaries 0
16 TestDownloadOnly/v1.34.2/kubectl 0
22 TestDownloadOnly/v1.35.0-beta.0/preload-exists 0.15
25 TestDownloadOnly/v1.35.0-beta.0/kubectl 0
29 TestDownloadOnlyKic 0.43
31 TestOffline 0
42 TestAddons/serial/GCPAuth/RealCredentials 0.01
49 TestAddons/parallel/Olm 0
56 TestAddons/parallel/AmdGpuDevicePlugin 0
60 TestDockerFlags 0
64 TestHyperKitDriverInstallOrUpdate 0
65 TestHyperkitDriverSkipUpgrade 0
112 TestFunctional/parallel/MySQL 0
116 TestFunctional/parallel/DockerEnv 0
117 TestFunctional/parallel/PodmanEnv 0
153 TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDig 0
154 TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil 0
155 TestFunctional/parallel/TunnelCmd/serial/AccessThroughDNS 0
206 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/MySQL 0
210 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/DockerEnv 0
211 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PodmanEnv 0
241 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/DNSResolutionByDig 0
242 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil 0
243 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/AccessThroughDNS 0
260 TestGvisorAddon 0
282 TestImageBuild 0
283 TestISOImage 0
347 TestChangeNoneUser 0
350 TestScheduledStopWindows 0
352 TestSkaffold 0

TestDownloadOnly/v1.28.0/cached-images (0s)

=== RUN   TestDownloadOnly/v1.28.0/cached-images
aaa_download_only_test.go:128: Preload exists, images won't be cached
--- SKIP: TestDownloadOnly/v1.28.0/cached-images (0.00s)

TestDownloadOnly/v1.28.0/binaries (0s)

=== RUN   TestDownloadOnly/v1.28.0/binaries
aaa_download_only_test.go:150: Preload exists, binaries are present within.
--- SKIP: TestDownloadOnly/v1.28.0/binaries (0.00s)

TestDownloadOnly/v1.28.0/kubectl (0s)

=== RUN   TestDownloadOnly/v1.28.0/kubectl
aaa_download_only_test.go:166: Test for darwin and windows
--- SKIP: TestDownloadOnly/v1.28.0/kubectl (0.00s)

TestDownloadOnly/v1.34.2/cached-images (0s)

=== RUN   TestDownloadOnly/v1.34.2/cached-images
aaa_download_only_test.go:128: Preload exists, images won't be cached
--- SKIP: TestDownloadOnly/v1.34.2/cached-images (0.00s)

TestDownloadOnly/v1.34.2/binaries (0s)

=== RUN   TestDownloadOnly/v1.34.2/binaries
aaa_download_only_test.go:150: Preload exists, binaries are present within.
--- SKIP: TestDownloadOnly/v1.34.2/binaries (0.00s)

TestDownloadOnly/v1.34.2/kubectl (0s)

=== RUN   TestDownloadOnly/v1.34.2/kubectl
aaa_download_only_test.go:166: Test for darwin and windows
--- SKIP: TestDownloadOnly/v1.34.2/kubectl (0.00s)

TestDownloadOnly/v1.35.0-beta.0/preload-exists (0.15s)

=== RUN   TestDownloadOnly/v1.35.0-beta.0/preload-exists
I1124 08:43:23.557278 1654467 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
W1124 08:43:23.656597 1654467 preload.go:144] https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.35.0-beta.0/preloaded-images-k8s-v18-v1.35.0-beta.0-containerd-overlay2-arm64.tar.lz4 status code: 404
W1124 08:43:23.704070 1654467 preload.go:144] https://github.com/kubernetes-sigs/minikube-preloads/releases/download/v18/preloaded-images-k8s-v18-v1.35.0-beta.0-containerd-overlay2-arm64.tar.lz4 status code: 404
aaa_download_only_test.go:113: No preload image
--- SKIP: TestDownloadOnly/v1.35.0-beta.0/preload-exists (0.15s)

TestDownloadOnly/v1.35.0-beta.0/kubectl (0s)

=== RUN   TestDownloadOnly/v1.35.0-beta.0/kubectl
aaa_download_only_test.go:166: Test for darwin and windows
--- SKIP: TestDownloadOnly/v1.35.0-beta.0/kubectl (0.00s)

TestDownloadOnlyKic (0.43s)

=== RUN   TestDownloadOnlyKic
aaa_download_only_test.go:231: (dbg) Run:  out/minikube-linux-arm64 start --download-only -p download-docker-750165 --alsologtostderr --driver=docker  --container-runtime=containerd
aaa_download_only_test.go:248: Skip for arm64 platform. See https://github.com/kubernetes/minikube/issues/10144
helpers_test.go:175: Cleaning up "download-docker-750165" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-arm64 delete -p download-docker-750165
--- SKIP: TestDownloadOnlyKic (0.43s)

TestOffline (0s)

=== RUN   TestOffline
=== PAUSE TestOffline
=== CONT  TestOffline
aab_offline_test.go:35: skipping TestOffline - only docker runtime supported on arm64. See https://github.com/kubernetes/minikube/issues/10144
--- SKIP: TestOffline (0.00s)

TestAddons/serial/GCPAuth/RealCredentials (0.01s)

=== RUN   TestAddons/serial/GCPAuth/RealCredentials
addons_test.go:759: This test requires a GCE instance (excluding Cloud Shell) with a container based driver
--- SKIP: TestAddons/serial/GCPAuth/RealCredentials (0.01s)

TestAddons/parallel/Olm (0s)

=== RUN   TestAddons/parallel/Olm
=== PAUSE TestAddons/parallel/Olm
=== CONT  TestAddons/parallel/Olm
addons_test.go:483: Skipping OLM addon test until https://github.com/operator-framework/operator-lifecycle-manager/issues/2534 is resolved
--- SKIP: TestAddons/parallel/Olm (0.00s)

TestAddons/parallel/AmdGpuDevicePlugin (0s)

=== RUN   TestAddons/parallel/AmdGpuDevicePlugin
=== PAUSE TestAddons/parallel/AmdGpuDevicePlugin
=== CONT  TestAddons/parallel/AmdGpuDevicePlugin
addons_test.go:1033: skip amd gpu test on all but docker driver and amd64 platform
--- SKIP: TestAddons/parallel/AmdGpuDevicePlugin (0.00s)

TestDockerFlags (0s)

=== RUN   TestDockerFlags
docker_test.go:41: skipping: only runs with docker container runtime, currently testing containerd
--- SKIP: TestDockerFlags (0.00s)

TestHyperKitDriverInstallOrUpdate (0s)

=== RUN   TestHyperKitDriverInstallOrUpdate
driver_install_or_update_test.go:37: Skip if not darwin.
--- SKIP: TestHyperKitDriverInstallOrUpdate (0.00s)

TestHyperkitDriverSkipUpgrade (0s)

=== RUN   TestHyperkitDriverSkipUpgrade
driver_install_or_update_test.go:101: Skip if not darwin.
--- SKIP: TestHyperkitDriverSkipUpgrade (0.00s)

TestFunctional/parallel/MySQL (0s)

=== RUN   TestFunctional/parallel/MySQL
=== PAUSE TestFunctional/parallel/MySQL
=== CONT  TestFunctional/parallel/MySQL
functional_test.go:1792: arm64 is not supported by mysql. Skip the test. See https://github.com/kubernetes/minikube/issues/10144
--- SKIP: TestFunctional/parallel/MySQL (0.00s)

TestFunctional/parallel/DockerEnv (0s)

=== RUN   TestFunctional/parallel/DockerEnv
=== PAUSE TestFunctional/parallel/DockerEnv
=== CONT  TestFunctional/parallel/DockerEnv
functional_test.go:478: only validate docker env with docker container runtime, currently testing containerd
--- SKIP: TestFunctional/parallel/DockerEnv (0.00s)

TestFunctional/parallel/PodmanEnv (0s)

=== RUN   TestFunctional/parallel/PodmanEnv
=== PAUSE TestFunctional/parallel/PodmanEnv
=== CONT  TestFunctional/parallel/PodmanEnv
functional_test.go:565: only validate podman env with docker container runtime, currently testing containerd
--- SKIP: TestFunctional/parallel/PodmanEnv (0.00s)

TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDig (0s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDig
functional_test_tunnel_test.go:99: DNS forwarding is only supported for Hyperkit on Darwin, skipping test DNS forwarding
--- SKIP: TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDig (0.00s)

TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil (0s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil
functional_test_tunnel_test.go:99: DNS forwarding is only supported for Hyperkit on Darwin, skipping test DNS forwarding
--- SKIP: TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil (0.00s)

TestFunctional/parallel/TunnelCmd/serial/AccessThroughDNS (0s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/AccessThroughDNS
functional_test_tunnel_test.go:99: DNS forwarding is only supported for Hyperkit on Darwin, skipping test DNS forwarding
--- SKIP: TestFunctional/parallel/TunnelCmd/serial/AccessThroughDNS (0.00s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/MySQL (0s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/MySQL
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/MySQL
=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/MySQL
functional_test.go:1792: arm64 is not supported by mysql. Skip the test. See https://github.com/kubernetes/minikube/issues/10144
--- SKIP: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/MySQL (0.00s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/DockerEnv (0s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/DockerEnv
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/DockerEnv
=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/DockerEnv
functional_test.go:478: only validate docker env with docker container runtime, currently testing containerd
--- SKIP: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/DockerEnv (0.00s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PodmanEnv (0s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PodmanEnv
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PodmanEnv
=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PodmanEnv
functional_test.go:565: only validate podman env with docker container runtime, currently testing containerd
--- SKIP: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PodmanEnv (0.00s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/DNSResolutionByDig (0s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/DNSResolutionByDig
functional_test_tunnel_test.go:99: DNS forwarding is only supported for Hyperkit on Darwin, skipping test DNS forwarding
--- SKIP: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/DNSResolutionByDig (0.00s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil (0s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil
functional_test_tunnel_test.go:99: DNS forwarding is only supported for Hyperkit on Darwin, skipping test DNS forwarding
--- SKIP: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil (0.00s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/AccessThroughDNS (0s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/AccessThroughDNS
functional_test_tunnel_test.go:99: DNS forwarding is only supported for Hyperkit on Darwin, skipping test DNS forwarding
--- SKIP: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/AccessThroughDNS (0.00s)

TestGvisorAddon (0s)

=== RUN   TestGvisorAddon
gvisor_addon_test.go:34: skipping test because --gvisor=false
--- SKIP: TestGvisorAddon (0.00s)

TestImageBuild (0s)

=== RUN   TestImageBuild
image_test.go:33: 
--- SKIP: TestImageBuild (0.00s)

TestISOImage (0s)

=== RUN   TestISOImage
iso_test.go:36: This test requires a VM driver
--- SKIP: TestISOImage (0.00s)

TestChangeNoneUser (0s)

=== RUN   TestChangeNoneUser
none_test.go:38: Test requires none driver and SUDO_USER env to not be empty
--- SKIP: TestChangeNoneUser (0.00s)

TestScheduledStopWindows (0s)

=== RUN   TestScheduledStopWindows
scheduled_stop_test.go:42: test only runs on windows
--- SKIP: TestScheduledStopWindows (0.00s)

TestSkaffold (0s)

=== RUN   TestSkaffold
skaffold_test.go:45: skaffold requires docker-env, currently testing containerd container runtime
--- SKIP: TestSkaffold (0.00s)